00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v22.11" build number 2373 00:00:00.001 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3638 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.038 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.039 The recommended git tool is: git 00:00:00.039 using credential 00000000-0000-0000-0000-000000000002 00:00:00.042 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.059 Fetching changes from the remote Git repository 00:00:00.061 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.080 Using shallow fetch with depth 1 00:00:00.080 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.080 > git --version # timeout=10 00:00:00.104 > git --version # 'git version 2.39.2' 00:00:00.104 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.132 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.132 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.977 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.990 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.000 Checking out Revision b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf (FETCH_HEAD) 00:00:03.000 > git config core.sparsecheckout # timeout=10 00:00:03.010 > git read-tree -mu HEAD # timeout=10 00:00:03.025 > git checkout -f b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=5 00:00:03.044 Commit message: "jenkins/jjb-config: Ignore OS version mismatch under freebsd" 00:00:03.044 > git rev-list --no-walk b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=10 00:00:03.123 [Pipeline] Start of Pipeline 00:00:03.137 [Pipeline] library 00:00:03.138 Loading library shm_lib@master 00:00:03.139 Library shm_lib@master is cached. Copying from home. 00:00:03.159 [Pipeline] node 00:00:03.185 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.187 [Pipeline] { 00:00:03.193 [Pipeline] catchError 00:00:03.194 [Pipeline] { 00:00:03.201 [Pipeline] wrap 00:00:03.206 [Pipeline] { 00:00:03.211 [Pipeline] stage 00:00:03.213 [Pipeline] { (Prologue) 00:00:03.402 [Pipeline] sh 00:00:04.256 + logger -p user.info -t JENKINS-CI 00:00:04.289 [Pipeline] echo 00:00:04.291 Node: WFP20 00:00:04.299 [Pipeline] sh 00:00:04.638 [Pipeline] setCustomBuildProperty 00:00:04.650 [Pipeline] echo 00:00:04.651 Cleanup processes 00:00:04.656 [Pipeline] sh 00:00:04.955 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.955 4722 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.968 [Pipeline] sh 00:00:05.262 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.262 ++ grep -v 'sudo pgrep' 00:00:05.262 ++ awk '{print $1}' 00:00:05.262 + sudo kill -9 00:00:05.262 + true 00:00:05.279 [Pipeline] cleanWs 00:00:05.290 [WS-CLEANUP] Deleting project workspace... 00:00:05.290 [WS-CLEANUP] Deferred wipeout is used... 
00:00:05.301 [WS-CLEANUP] done 00:00:05.305 [Pipeline] setCustomBuildProperty 00:00:05.317 [Pipeline] sh 00:00:05.606 + sudo git config --global --replace-all safe.directory '*' 00:00:05.697 [Pipeline] httpRequest 00:00:07.365 [Pipeline] echo 00:00:07.367 Sorcerer 10.211.164.20 is alive 00:00:07.375 [Pipeline] retry 00:00:07.377 [Pipeline] { 00:00:07.389 [Pipeline] httpRequest 00:00:07.393 HttpMethod: GET 00:00:07.394 URL: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:07.395 Sending request to url: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:07.399 Response Code: HTTP/1.1 200 OK 00:00:07.399 Success: Status code 200 is in the accepted range: 200,404 00:00:07.400 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:08.247 [Pipeline] } 00:00:08.261 [Pipeline] // retry 00:00:08.266 [Pipeline] sh 00:00:08.577 + tar --no-same-owner -xf jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:08.593 [Pipeline] httpRequest 00:00:08.905 [Pipeline] echo 00:00:08.906 Sorcerer 10.211.164.20 is alive 00:00:08.915 [Pipeline] retry 00:00:08.916 [Pipeline] { 00:00:08.927 [Pipeline] httpRequest 00:00:08.931 HttpMethod: GET 00:00:08.932 URL: http://10.211.164.20/packages/spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:00:08.933 Sending request to url: http://10.211.164.20/packages/spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:00:08.948 Response Code: HTTP/1.1 200 OK 00:00:08.949 Success: Status code 200 is in the accepted range: 200,404 00:00:08.949 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:01:36.188 [Pipeline] } 00:01:36.205 [Pipeline] // retry 00:01:36.213 [Pipeline] sh 00:01:36.515 + tar --no-same-owner -xf spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:01:39.093 [Pipeline] sh 00:01:39.389 + git -C spdk log --oneline -n5 00:01:39.389 83e8405e4 nvmf/fc: Qpair disconnect callback: Serialize FC delete connection & close qpair process 00:01:39.389 0eab4c6fb nvmf/fc: Validate the ctrlr pointer inside nvmf_fc_req_bdev_abort() 00:01:39.389 4bcab9fb9 correct kick for CQ full case 00:01:39.389 8531656d3 test/nvmf: Interrupt test for local pcie nvme device 00:01:39.389 318515b44 nvme/perf: interrupt mode support for pcie controller 00:01:39.410 [Pipeline] withCredentials 00:01:39.422 > git --version # timeout=10 00:01:39.436 > git --version # 'git version 2.39.2' 00:01:39.465 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:39.467 [Pipeline] { 00:01:39.475 [Pipeline] retry 00:01:39.477 [Pipeline] { 00:01:39.490 [Pipeline] sh 00:01:39.997 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:40.273 [Pipeline] } 00:01:40.291 [Pipeline] // retry 00:01:40.296 [Pipeline] } 00:01:40.310 [Pipeline] // withCredentials 00:01:40.319 [Pipeline] httpRequest 00:01:40.884 [Pipeline] echo 00:01:40.885 Sorcerer 10.211.164.20 is alive 00:01:40.892 [Pipeline] retry 00:01:40.894 [Pipeline] { 00:01:40.904 [Pipeline] httpRequest 00:01:40.908 HttpMethod: GET 00:01:40.909 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:40.910 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:40.915 Response Code: HTTP/1.1 200 OK 00:01:40.915 Success: Status code 200 is in the accepted range: 200,404 00:01:40.915 Saving response body to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:46.719 [Pipeline] } 00:01:46.735 [Pipeline] // retry 00:01:46.742 [Pipeline] sh 00:01:47.034 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:48.435 [Pipeline] sh 00:01:48.726 + git -C dpdk log --oneline -n5 00:01:48.726 caf0f5d395 version: 22.11.4 00:01:48.726 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:48.726 dc9c799c7d vhost: fix missing spinlock unlock 00:01:48.726 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:48.726 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:48.737 [Pipeline] } 00:01:48.751 [Pipeline] // stage 00:01:48.760 [Pipeline] stage 00:01:48.762 [Pipeline] { (Prepare) 00:01:48.781 [Pipeline] writeFile 00:01:48.796 [Pipeline] sh 00:01:49.086 + logger -p user.info -t JENKINS-CI 00:01:49.100 [Pipeline] sh 00:01:49.390 + logger -p user.info -t JENKINS-CI 00:01:49.403 [Pipeline] sh 00:01:49.691 + cat autorun-spdk.conf 00:01:49.691 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:49.691 SPDK_TEST_FUZZER_SHORT=1 00:01:49.691 SPDK_TEST_FUZZER=1 00:01:49.691 SPDK_TEST_SETUP=1 00:01:49.691 SPDK_RUN_UBSAN=1 00:01:49.691 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:49.691 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:49.699 RUN_NIGHTLY=1 00:01:49.704 [Pipeline] readFile 00:01:49.738 [Pipeline] withEnv 00:01:49.740 [Pipeline] { 00:01:49.751 [Pipeline] sh 00:01:50.041 + set -ex 00:01:50.041 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:50.041 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:50.041 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:50.041 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:50.041 ++ SPDK_TEST_FUZZER=1 00:01:50.041 ++ SPDK_TEST_SETUP=1 00:01:50.041 ++ SPDK_RUN_UBSAN=1 00:01:50.041 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:50.041 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:50.041 ++ RUN_NIGHTLY=1 00:01:50.042 + case $SPDK_TEST_NVMF_NICS in 00:01:50.042 + DRIVERS= 00:01:50.042 + [[ -n '' ]] 00:01:50.042 + exit 0 00:01:50.052 [Pipeline] } 00:01:50.067 [Pipeline] // withEnv 00:01:50.072 [Pipeline] } 00:01:50.085 [Pipeline] // stage 00:01:50.093 [Pipeline] catchError 00:01:50.095 [Pipeline] { 00:01:50.108 [Pipeline] timeout 00:01:50.108 Timeout set to expire in 30 min 00:01:50.110 [Pipeline] { 00:01:50.122 [Pipeline] stage 00:01:50.124 [Pipeline] { (Tests) 00:01:50.136 [Pipeline] sh 00:01:50.427 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:50.427 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:50.427 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:50.427 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:50.427 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:50.427 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:50.427 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:50.427 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:50.427 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:50.427 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:50.427 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:50.427 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:50.427 + source /etc/os-release 00:01:50.427 ++ NAME='Fedora Linux' 00:01:50.427 ++ VERSION='39 (Cloud Edition)' 00:01:50.427 ++ ID=fedora 00:01:50.427 ++ VERSION_ID=39 00:01:50.427 ++ VERSION_CODENAME= 00:01:50.427 ++ PLATFORM_ID=platform:f39 00:01:50.427 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:01:50.427 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:50.427 ++ LOGO=fedora-logo-icon 00:01:50.427 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:01:50.427 ++ HOME_URL=https://fedoraproject.org/ 00:01:50.427 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:01:50.427 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:50.427 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:50.427 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:50.427 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:01:50.427 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:50.427 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:01:50.427 ++ SUPPORT_END=2024-11-12 00:01:50.427 ++ VARIANT='Cloud Edition' 00:01:50.427 ++ VARIANT_ID=cloud 00:01:50.427 + uname -a 00:01:50.427 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:01:50.427 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:53.731 Hugepages 00:01:53.732 node hugesize free / total 00:01:53.732 node0 1048576kB 0 / 0 00:01:53.732 node0 2048kB 0 / 0 00:01:53.732 node1 1048576kB 0 / 0 00:01:53.732 node1 2048kB 0 / 0 00:01:53.732 00:01:53.732 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:53.732 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:53.732 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:53.732 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:53.732 + rm -f /tmp/spdk-ld-path 00:01:53.732 + source autorun-spdk.conf 00:01:53.732 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:53.732 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:53.732 ++ SPDK_TEST_FUZZER=1 00:01:53.732 ++ SPDK_TEST_SETUP=1 00:01:53.732 ++ SPDK_RUN_UBSAN=1 00:01:53.732 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:53.732 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:53.732 ++ RUN_NIGHTLY=1 00:01:53.732 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:53.732 + [[ -n '' ]] 00:01:53.732 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:53.732 + for M in 
/var/spdk/build-*-manifest.txt 00:01:53.732 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:01:53.732 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:53.732 + for M in /var/spdk/build-*-manifest.txt 00:01:53.732 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:53.732 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:53.732 + for M in /var/spdk/build-*-manifest.txt 00:01:53.732 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:53.732 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:53.732 ++ uname 00:01:53.732 + [[ Linux == \L\i\n\u\x ]] 00:01:53.732 + sudo dmesg -T 00:01:53.732 + sudo dmesg --clear 00:01:53.732 + dmesg_pid=5640 00:01:53.732 + [[ Fedora Linux == FreeBSD ]] 00:01:53.732 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:53.732 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:53.732 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:53.732 + sudo dmesg -Tw 00:01:53.732 + [[ -x /usr/src/fio-static/fio ]] 00:01:53.732 + export FIO_BIN=/usr/src/fio-static/fio 00:01:53.732 + FIO_BIN=/usr/src/fio-static/fio 00:01:53.732 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:53.732 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:53.732 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:53.732 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:53.732 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:53.732 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:53.732 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:53.732 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:53.732 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:53.732 10:56:18 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:01:53.732 10:56:18 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_TEST_FUZZER_SHORT=1 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_SETUP=1 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_RUN_UBSAN=1 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:53.732 10:56:18 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1 00:01:53.732 10:56:18 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT 00:01:53.732 10:56:18 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:53.732 10:56:18 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:01:53.732 10:56:18 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:53.732 10:56:18 -- scripts/common.sh@15 -- $ shopt -s extglob 00:01:53.732 10:56:18 -- scripts/common.sh@544 -- $ [[ -e 
/bin/wpdk_common.sh ]] 00:01:53.732 10:56:18 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:53.732 10:56:18 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:53.732 10:56:18 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:53.732 10:56:18 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:53.732 10:56:18 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:53.732 10:56:18 -- paths/export.sh@5 -- $ export PATH 00:01:53.732 10:56:18 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:53.732 10:56:18 -- common/autobuild_common.sh@485 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:53.732 10:56:18 -- common/autobuild_common.sh@486 -- $ date +%s 00:01:53.732 10:56:18 -- common/autobuild_common.sh@486 -- $ mktemp -dt spdk_1731837378.XXXXXX 00:01:53.732 10:56:18 -- common/autobuild_common.sh@486 -- $ SPDK_WORKSPACE=/tmp/spdk_1731837378.lDhfvq 00:01:53.732 10:56:18 -- common/autobuild_common.sh@488 -- $ [[ -n '' ]] 00:01:53.732 10:56:18 -- common/autobuild_common.sh@492 -- $ '[' -n v22.11.4 ']' 00:01:53.732 10:56:18 -- common/autobuild_common.sh@493 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:53.732 10:56:18 -- common/autobuild_common.sh@493 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:53.732 10:56:18 -- common/autobuild_common.sh@499 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:53.732 10:56:18 -- common/autobuild_common.sh@501 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:53.732 
10:56:18 -- common/autobuild_common.sh@502 -- $ get_config_params 00:01:53.732 10:56:18 -- common/autotest_common.sh@409 -- $ xtrace_disable 00:01:53.732 10:56:18 -- common/autotest_common.sh@10 -- $ set +x 00:01:53.994 10:56:18 -- common/autobuild_common.sh@502 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:53.994 10:56:18 -- common/autobuild_common.sh@504 -- $ start_monitor_resources 00:01:53.994 10:56:18 -- pm/common@17 -- $ local monitor 00:01:53.994 10:56:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:53.994 10:56:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:53.994 10:56:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:53.994 10:56:18 -- pm/common@21 -- $ date +%s 00:01:53.994 10:56:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:53.994 10:56:18 -- pm/common@21 -- $ date +%s 00:01:53.994 10:56:18 -- pm/common@25 -- $ sleep 1 00:01:53.994 10:56:18 -- pm/common@21 -- $ date +%s 00:01:53.994 10:56:18 -- pm/common@21 -- $ date +%s 00:01:53.994 10:56:18 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731837378 00:01:53.994 10:56:18 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731837378 00:01:53.994 10:56:18 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731837378 00:01:53.994 10:56:18 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731837378 00:01:53.994 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731837378_collect-vmstat.pm.log 00:01:53.994 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731837378_collect-cpu-load.pm.log 00:01:53.994 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731837378_collect-cpu-temp.pm.log 00:01:53.994 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731837378_collect-bmc-pm.bmc.pm.log 00:01:54.937 10:56:19 -- common/autobuild_common.sh@505 -- $ trap stop_monitor_resources EXIT 00:01:54.937 10:56:19 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:54.937 10:56:19 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:54.937 10:56:19 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:54.937 10:56:19 -- spdk/autobuild.sh@16 -- $ date -u 00:01:54.937 Sun Nov 17 09:56:19 AM UTC 2024 00:01:54.937 10:56:19 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:54.937 v25.01-pre-189-g83e8405e4 00:01:54.937 10:56:19 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:54.937 10:56:19 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:54.937 10:56:19 -- 
spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:54.937 10:56:19 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:01:54.937 10:56:19 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:01:54.937 10:56:19 -- common/autotest_common.sh@10 -- $ set +x 00:01:54.937 ************************************ 00:01:54.937 START TEST ubsan 00:01:54.937 ************************************ 00:01:54.937 10:56:19 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan' 00:01:54.937 using ubsan 00:01:54.937 00:01:54.937 real 0m0.001s 00:01:54.937 user 0m0.000s 00:01:54.937 sys 0m0.000s 00:01:54.937 10:56:19 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:01:54.937 10:56:19 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:54.937 ************************************ 00:01:54.937 END TEST ubsan 00:01:54.937 ************************************ 00:01:54.937 10:56:19 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:54.937 10:56:19 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:54.937 10:56:19 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:54.937 10:56:19 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:01:54.937 10:56:19 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:01:54.937 10:56:19 -- common/autotest_common.sh@10 -- $ set +x 00:01:54.937 ************************************ 00:01:54.937 START TEST build_native_dpdk 00:01:54.937 ************************************ 00:01:54.937 10:56:19 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:54.937 10:56:19 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:54.938 10:56:19 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:55.199 caf0f5d395 version: 22.11.4 00:01:55.199 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:55.199 dc9c799c7d vhost: fix missing spinlock unlock 00:01:55.199 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:55.199 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:55.199 
10:56:19 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:55.199 patching file config/rte_config.h 00:01:55.199 Hunk #1 succeeded at 60 (offset 1 line). 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:55.199 patching file lib/pcapng/rte_pcapng.c 00:01:55.199 Hunk #1 succeeded at 110 (offset -18 lines). 00:01:55.199 10:56:19 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 22.11.4 24.07.0 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 22.11.4 '>=' 24.07.0 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:01:55.199 10:56:19 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:55.200 10:56:19 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:01:55.200 10:56:19 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:01:55.200 10:56:19 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:01:55.200 10:56:19 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:01:55.200 10:56:19 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:55.200 10:56:19 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:01.783 The Meson build system 00:02:01.783 Version: 1.5.0 00:02:01.783 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:01.783 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:01.783 Build type: native build 00:02:01.783 Program cat found: YES (/usr/bin/cat) 00:02:01.783 Project name: DPDK 00:02:01.783 Project version: 22.11.4 00:02:01.783 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:01.783 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:01.783 Host machine cpu family: x86_64 00:02:01.783 Host machine cpu: x86_64 00:02:01.783 Message: ## Building in Developer Mode ## 00:02:01.783 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:01.783 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:01.783 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:01.783 Program objdump found: YES (/usr/bin/objdump) 00:02:01.783 Program python3 found: YES (/usr/bin/python3) 00:02:01.783 Program cat found: YES (/usr/bin/cat) 00:02:01.783 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
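The version gating traced a few entries above (lt 22.11.4 21.11.0 returning 1, lt 22.11.4 24.07.0 returning 0, ge 22.11.4 24.07.0 returning 1) is what decides which compatibility patches get applied to the DPDK tree, here config/rte_config.h and lib/pcapng/rte_pcapng.c. It comes from the cmp_versions helper in SPDK's scripts/common.sh; the field-by-field compare it performs can be sketched standalone as below. This re-implementation is illustrative only, not the actual SPDK helper.

    # Illustrative sketch of a dotted-version compare, in the spirit of
    # scripts/common.sh's cmp_versions; not the actual SPDK code.
    cmp_versions_sketch() {
        local -a v1 v2
        local i a b op=$2
        IFS='.-:' read -ra v1 <<< "$1"
        IFS='.-:' read -ra v2 <<< "$3"
        for ((i = 0; i < ${#v1[@]} || i < ${#v2[@]}; i++)); do
            a=${v1[i]:-0} b=${v2[i]:-0}
            # Missing or non-numeric fields (pre-release tags etc.) compare as 0 here
            [[ $a =~ ^[0-9]+$ ]] || a=0
            [[ $b =~ ^[0-9]+$ ]] || b=0
            ((a > b)) && { [[ $op == '>' || $op == '>=' ]]; return; }
            ((a < b)) && { [[ $op == '<' || $op == '<=' ]]; return; }
        done
        [[ $op == *'='* ]]   # all fields equal
    }

    cmp_versions_sketch 22.11.4 '<' 21.11.0 || echo "not older than 21.11.0"
    cmp_versions_sketch 22.11.4 '<' 24.07.0 && echo "predates 24.07.0"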
00:02:01.783 Checking for size of "void *" : 8 00:02:01.783 Checking for size of "void *" : 8 (cached) 00:02:01.783 Library m found: YES 00:02:01.783 Library numa found: YES 00:02:01.783 Has header "numaif.h" : YES 00:02:01.783 Library fdt found: NO 00:02:01.783 Library execinfo found: NO 00:02:01.783 Has header "execinfo.h" : YES 00:02:01.783 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:01.783 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:01.783 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:01.783 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:01.783 Run-time dependency openssl found: YES 3.1.1 00:02:01.783 Run-time dependency libpcap found: YES 1.10.4 00:02:01.783 Has header "pcap.h" with dependency libpcap: YES 00:02:01.783 Compiler for C supports arguments -Wcast-qual: YES 00:02:01.783 Compiler for C supports arguments -Wdeprecated: YES 00:02:01.783 Compiler for C supports arguments -Wformat: YES 00:02:01.783 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:01.783 Compiler for C supports arguments -Wformat-security: NO 00:02:01.783 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:01.783 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:01.783 Compiler for C supports arguments -Wnested-externs: YES 00:02:01.783 Compiler for C supports arguments -Wold-style-definition: YES 00:02:01.783 Compiler for C supports arguments -Wpointer-arith: YES 00:02:01.783 Compiler for C supports arguments -Wsign-compare: YES 00:02:01.783 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:01.783 Compiler for C supports arguments -Wundef: YES 00:02:01.783 Compiler for C supports arguments -Wwrite-strings: YES 00:02:01.783 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:01.783 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:01.783 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:01.783 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:01.783 Compiler for C supports arguments -mavx512f: YES 00:02:01.783 Checking if "AVX512 checking" compiles: YES 00:02:01.783 Fetching value of define "__SSE4_2__" : 1 00:02:01.783 Fetching value of define "__AES__" : 1 00:02:01.783 Fetching value of define "__AVX__" : 1 00:02:01.783 Fetching value of define "__AVX2__" : 1 00:02:01.783 Fetching value of define "__AVX512BW__" : 1 00:02:01.783 Fetching value of define "__AVX512CD__" : 1 00:02:01.783 Fetching value of define "__AVX512DQ__" : 1 00:02:01.783 Fetching value of define "__AVX512F__" : 1 00:02:01.783 Fetching value of define "__AVX512VL__" : 1 00:02:01.783 Fetching value of define "__PCLMUL__" : 1 00:02:01.783 Fetching value of define "__RDRND__" : 1 00:02:01.783 Fetching value of define "__RDSEED__" : 1 00:02:01.783 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:01.783 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:01.783 Message: lib/kvargs: Defining dependency "kvargs" 00:02:01.783 Message: lib/telemetry: Defining dependency "telemetry" 00:02:01.783 Checking for function "getentropy" : YES 00:02:01.783 Message: lib/eal: Defining dependency "eal" 00:02:01.783 Message: lib/ring: Defining dependency "ring" 00:02:01.783 Message: lib/rcu: Defining dependency "rcu" 00:02:01.783 Message: lib/mempool: Defining dependency "mempool" 00:02:01.783 Message: lib/mbuf: Defining dependency "mbuf" 00:02:01.783 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:01.783 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:02:01.783 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:01.783 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:01.783 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:01.783 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:01.783 Compiler for C supports arguments -mpclmul: YES 00:02:01.783 Compiler for C supports arguments -maes: YES 00:02:01.783 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:01.783 Compiler for C supports arguments -mavx512bw: YES 00:02:01.783 Compiler for C supports arguments -mavx512dq: YES 00:02:01.783 Compiler for C supports arguments -mavx512vl: YES 00:02:01.783 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:01.783 Compiler for C supports arguments -mavx2: YES 00:02:01.783 Compiler for C supports arguments -mavx: YES 00:02:01.783 Message: lib/net: Defining dependency "net" 00:02:01.783 Message: lib/meter: Defining dependency "meter" 00:02:01.783 Message: lib/ethdev: Defining dependency "ethdev" 00:02:01.783 Message: lib/pci: Defining dependency "pci" 00:02:01.783 Message: lib/cmdline: Defining dependency "cmdline" 00:02:01.783 Message: lib/metrics: Defining dependency "metrics" 00:02:01.783 Message: lib/hash: Defining dependency "hash" 00:02:01.783 Message: lib/timer: Defining dependency "timer" 00:02:01.784 Fetching value of define "__AVX2__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:01.784 Message: lib/acl: Defining dependency "acl" 00:02:01.784 Message: lib/bbdev: Defining dependency "bbdev" 00:02:01.784 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:01.784 Run-time dependency libelf found: YES 0.191 00:02:01.784 Message: lib/bpf: Defining dependency "bpf" 00:02:01.784 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:01.784 Message: lib/compressdev: Defining dependency "compressdev" 00:02:01.784 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:01.784 Message: lib/distributor: Defining dependency "distributor" 00:02:01.784 Message: lib/efd: Defining dependency "efd" 00:02:01.784 Message: lib/eventdev: Defining dependency "eventdev" 00:02:01.784 Message: lib/gpudev: Defining dependency "gpudev" 00:02:01.784 Message: lib/gro: Defining dependency "gro" 00:02:01.784 Message: lib/gso: Defining dependency "gso" 00:02:01.784 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:01.784 Message: lib/jobstats: Defining dependency "jobstats" 00:02:01.784 Message: lib/latencystats: Defining dependency "latencystats" 00:02:01.784 Message: lib/lpm: Defining dependency "lpm" 00:02:01.784 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512IFMA__" : (undefined) 00:02:01.784 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:01.784 Message: lib/member: Defining dependency "member" 00:02:01.784 Message: lib/pcapng: Defining dependency "pcapng" 00:02:01.784 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:01.784 Message: lib/power: Defining dependency "power" 00:02:01.784 Message: lib/rawdev: Defining dependency "rawdev" 00:02:01.784 Message: lib/regexdev: Defining dependency "regexdev" 00:02:01.784 Message: lib/dmadev: 
Defining dependency "dmadev" 00:02:01.784 Message: lib/rib: Defining dependency "rib" 00:02:01.784 Message: lib/reorder: Defining dependency "reorder" 00:02:01.784 Message: lib/sched: Defining dependency "sched" 00:02:01.784 Message: lib/security: Defining dependency "security" 00:02:01.784 Message: lib/stack: Defining dependency "stack" 00:02:01.784 Has header "linux/userfaultfd.h" : YES 00:02:01.784 Message: lib/vhost: Defining dependency "vhost" 00:02:01.784 Message: lib/ipsec: Defining dependency "ipsec" 00:02:01.784 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:01.784 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:01.784 Message: lib/fib: Defining dependency "fib" 00:02:01.784 Message: lib/port: Defining dependency "port" 00:02:01.784 Message: lib/pdump: Defining dependency "pdump" 00:02:01.784 Message: lib/table: Defining dependency "table" 00:02:01.784 Message: lib/pipeline: Defining dependency "pipeline" 00:02:01.784 Message: lib/graph: Defining dependency "graph" 00:02:01.784 Message: lib/node: Defining dependency "node" 00:02:01.784 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:01.784 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:01.784 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:01.784 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:01.784 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:01.784 Compiler for C supports arguments -Wno-unused-value: YES 00:02:01.784 Compiler for C supports arguments -Wno-format: YES 00:02:01.784 Compiler for C supports arguments -Wno-format-security: YES 00:02:01.784 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:02.356 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:02.356 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:02.356 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:02.356 Fetching value of define "__AVX2__" : 1 (cached) 00:02:02.356 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:02.356 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:02.356 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:02.356 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:02.356 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:02.356 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:02.356 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:02.356 Configuring doxy-api.conf using configuration 00:02:02.356 Program sphinx-build found: NO 00:02:02.356 Configuring rte_build_config.h using configuration 00:02:02.356 Message: 00:02:02.356 ================= 00:02:02.356 Applications Enabled 00:02:02.356 ================= 00:02:02.356 00:02:02.356 apps: 00:02:02.356 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:02:02.356 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:02:02.356 test-security-perf, 00:02:02.356 00:02:02.356 Message: 00:02:02.356 ================= 00:02:02.356 Libraries Enabled 00:02:02.356 ================= 00:02:02.356 00:02:02.356 libs: 00:02:02.356 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:02:02.356 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:02:02.356 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:02:02.356 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:02:02.356 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:02:02.356 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:02:02.356 table, pipeline, graph, node, 00:02:02.356 00:02:02.356 Message: 00:02:02.356 =============== 00:02:02.356 Drivers Enabled 00:02:02.356 =============== 00:02:02.356 00:02:02.356 common: 00:02:02.356 00:02:02.356 bus: 00:02:02.356 pci, vdev, 00:02:02.356 mempool: 00:02:02.356 ring, 00:02:02.356 dma: 00:02:02.356 00:02:02.356 net: 00:02:02.356 i40e, 00:02:02.356 raw: 00:02:02.356 00:02:02.356 crypto: 00:02:02.356 00:02:02.356 compress: 00:02:02.356 00:02:02.356 regex: 00:02:02.356 00:02:02.356 vdpa: 00:02:02.356 00:02:02.356 event: 00:02:02.356 00:02:02.356 baseband: 00:02:02.356 00:02:02.356 gpu: 00:02:02.356 00:02:02.356 00:02:02.356 Message: 00:02:02.356 ================= 00:02:02.356 Content Skipped 00:02:02.356 ================= 00:02:02.356 00:02:02.356 apps: 00:02:02.356 00:02:02.356 libs: 00:02:02.356 kni: explicitly disabled via build config (deprecated lib) 00:02:02.356 flow_classify: explicitly disabled via build config (deprecated lib) 00:02:02.356 00:02:02.356 drivers: 00:02:02.356 common/cpt: not in enabled drivers build config 00:02:02.356 common/dpaax: not in enabled drivers build config 00:02:02.356 common/iavf: not in enabled drivers build config 00:02:02.356 common/idpf: not in enabled drivers build config 00:02:02.356 common/mvep: not in enabled drivers build config 00:02:02.356 common/octeontx: not in enabled drivers build config 00:02:02.356 bus/auxiliary: not in enabled drivers build config 00:02:02.356 bus/dpaa: not in enabled drivers build config 00:02:02.356 bus/fslmc: not in enabled drivers build config 00:02:02.356 bus/ifpga: not in enabled drivers build config 00:02:02.356 bus/vmbus: not in enabled drivers build config 00:02:02.356 common/cnxk: not in enabled drivers build config 00:02:02.356 common/mlx5: not in enabled drivers build config 00:02:02.356 common/qat: not in enabled drivers build config 00:02:02.356 common/sfc_efx: not in enabled drivers build config 00:02:02.356 mempool/bucket: not in enabled drivers build config 00:02:02.356 mempool/cnxk: not in enabled drivers build config 00:02:02.356 mempool/dpaa: not in enabled drivers build config 00:02:02.356 mempool/dpaa2: not in enabled drivers build config 00:02:02.356 mempool/octeontx: not in enabled drivers build config 00:02:02.356 mempool/stack: not in enabled drivers build config 00:02:02.356 dma/cnxk: not in enabled drivers build config 00:02:02.356 dma/dpaa: not in enabled drivers build config 00:02:02.356 dma/dpaa2: not in enabled drivers build config 00:02:02.356 dma/hisilicon: not in enabled drivers build config 00:02:02.356 dma/idxd: not in enabled drivers build config 00:02:02.356 dma/ioat: not in enabled drivers build config 00:02:02.356 dma/skeleton: not in enabled drivers build config 00:02:02.356 net/af_packet: not in enabled drivers build config 00:02:02.356 net/af_xdp: not in enabled drivers build config 00:02:02.356 net/ark: not in enabled drivers build config 00:02:02.356 net/atlantic: not in enabled drivers build config 00:02:02.356 net/avp: not in enabled drivers build config 00:02:02.356 net/axgbe: not in enabled drivers build config 00:02:02.356 net/bnx2x: not in enabled drivers build config 00:02:02.356 net/bnxt: not in enabled drivers build config 00:02:02.356 net/bonding: not in enabled drivers build config 00:02:02.356 net/cnxk: not in enabled drivers build config 
00:02:02.356 net/cxgbe: not in enabled drivers build config
00:02:02.356 net/dpaa: not in enabled drivers build config
00:02:02.356 net/dpaa2: not in enabled drivers build config
00:02:02.356 net/e1000: not in enabled drivers build config
00:02:02.356 net/ena: not in enabled drivers build config
00:02:02.356 net/enetc: not in enabled drivers build config
00:02:02.356 net/enetfec: not in enabled drivers build config
00:02:02.356 net/enic: not in enabled drivers build config
00:02:02.356 net/failsafe: not in enabled drivers build config
00:02:02.356 net/fm10k: not in enabled drivers build config
00:02:02.356 net/gve: not in enabled drivers build config
00:02:02.356 net/hinic: not in enabled drivers build config
00:02:02.356 net/hns3: not in enabled drivers build config
00:02:02.356 net/iavf: not in enabled drivers build config
00:02:02.356 net/ice: not in enabled drivers build config
00:02:02.356 net/idpf: not in enabled drivers build config
00:02:02.356 net/igc: not in enabled drivers build config
00:02:02.356 net/ionic: not in enabled drivers build config
00:02:02.356 net/ipn3ke: not in enabled drivers build config
00:02:02.356 net/ixgbe: not in enabled drivers build config
00:02:02.356 net/kni: not in enabled drivers build config
00:02:02.356 net/liquidio: not in enabled drivers build config
00:02:02.356 net/mana: not in enabled drivers build config
00:02:02.356 net/memif: not in enabled drivers build config
00:02:02.356 net/mlx4: not in enabled drivers build config
00:02:02.356 net/mlx5: not in enabled drivers build config
00:02:02.356 net/mvneta: not in enabled drivers build config
00:02:02.356 net/mvpp2: not in enabled drivers build config
00:02:02.356 net/netvsc: not in enabled drivers build config
00:02:02.356 net/nfb: not in enabled drivers build config
00:02:02.356 net/nfp: not in enabled drivers build config
00:02:02.356 net/ngbe: not in enabled drivers build config
00:02:02.356 net/null: not in enabled drivers build config
00:02:02.356 net/octeontx: not in enabled drivers build config
00:02:02.356 net/octeon_ep: not in enabled drivers build config
00:02:02.356 net/pcap: not in enabled drivers build config
00:02:02.356 net/pfe: not in enabled drivers build config
00:02:02.356 net/qede: not in enabled drivers build config
00:02:02.356 net/ring: not in enabled drivers build config
00:02:02.356 net/sfc: not in enabled drivers build config
00:02:02.356 net/softnic: not in enabled drivers build config
00:02:02.356 net/tap: not in enabled drivers build config
00:02:02.356 net/thunderx: not in enabled drivers build config
00:02:02.356 net/txgbe: not in enabled drivers build config
00:02:02.356 net/vdev_netvsc: not in enabled drivers build config
00:02:02.356 net/vhost: not in enabled drivers build config
00:02:02.356 net/virtio: not in enabled drivers build config
00:02:02.356 net/vmxnet3: not in enabled drivers build config
00:02:02.356 raw/cnxk_bphy: not in enabled drivers build config
00:02:02.356 raw/cnxk_gpio: not in enabled drivers build config
00:02:02.356 raw/dpaa2_cmdif: not in enabled drivers build config
00:02:02.357 raw/ifpga: not in enabled drivers build config
00:02:02.357 raw/ntb: not in enabled drivers build config
00:02:02.357 raw/skeleton: not in enabled drivers build config
00:02:02.357 crypto/armv8: not in enabled drivers build config
00:02:02.357 crypto/bcmfs: not in enabled drivers build config
00:02:02.357 crypto/caam_jr: not in enabled drivers build config
00:02:02.357 crypto/ccp: not in enabled drivers build config
00:02:02.357 crypto/cnxk: not in enabled drivers build config
00:02:02.357 crypto/dpaa_sec: not in enabled drivers build config
00:02:02.357 crypto/dpaa2_sec: not in enabled drivers build config
00:02:02.357 crypto/ipsec_mb: not in enabled drivers build config
00:02:02.357 crypto/mlx5: not in enabled drivers build config
00:02:02.357 crypto/mvsam: not in enabled drivers build config
00:02:02.357 crypto/nitrox: not in enabled drivers build config
00:02:02.357 crypto/null: not in enabled drivers build config
00:02:02.357 crypto/octeontx: not in enabled drivers build config
00:02:02.357 crypto/openssl: not in enabled drivers build config
00:02:02.357 crypto/scheduler: not in enabled drivers build config
00:02:02.357 crypto/uadk: not in enabled drivers build config
00:02:02.357 crypto/virtio: not in enabled drivers build config
00:02:02.357 compress/isal: not in enabled drivers build config
00:02:02.357 compress/mlx5: not in enabled drivers build config
00:02:02.357 compress/octeontx: not in enabled drivers build config
00:02:02.357 compress/zlib: not in enabled drivers build config
00:02:02.357 regex/mlx5: not in enabled drivers build config
00:02:02.357 regex/cn9k: not in enabled drivers build config
00:02:02.357 vdpa/ifc: not in enabled drivers build config
00:02:02.357 vdpa/mlx5: not in enabled drivers build config
00:02:02.357 vdpa/sfc: not in enabled drivers build config
00:02:02.357 event/cnxk: not in enabled drivers build config
00:02:02.357 event/dlb2: not in enabled drivers build config
00:02:02.357 event/dpaa: not in enabled drivers build config
00:02:02.357 event/dpaa2: not in enabled drivers build config
00:02:02.357 event/dsw: not in enabled drivers build config
00:02:02.357 event/opdl: not in enabled drivers build config
00:02:02.357 event/skeleton: not in enabled drivers build config
00:02:02.357 event/sw: not in enabled drivers build config
00:02:02.357 event/octeontx: not in enabled drivers build config
00:02:02.357 baseband/acc: not in enabled drivers build config
00:02:02.357 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:02.357 baseband/fpga_lte_fec: not in enabled drivers build config
00:02:02.357 baseband/la12xx: not in enabled drivers build config
00:02:02.357 baseband/null: not in enabled drivers build config
00:02:02.357 baseband/turbo_sw: not in enabled drivers build config
00:02:02.357 gpu/cuda: not in enabled drivers build config
00:02:02.357
00:02:02.357
00:02:02.357 Build targets in project: 311
00:02:02.357
00:02:02.357 DPDK 22.11.4
00:02:02.357
00:02:02.357 User defined options
00:02:02.357 libdir : lib
00:02:02.357 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:02.357 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:02.357 c_link_args :
00:02:02.357 enable_docs : false
00:02:02.357 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:02:02.357 enable_kmods : false
00:02:02.357 machine : native
00:02:02.357 tests : false
00:02:02.357
00:02:02.357 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:02.357 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
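The summary above was produced by the deprecated bare `meson [options]` form, which is what triggers the WARNING. As a hedged reconstruction from the logged values only (the wrapper script's real command line is not shown in this log, and the build directory name build-tmp is taken from the ninja step that follows), the equivalent non-deprecated invocation would look roughly like:

    # run from the dpdk source tree; values copied from "User defined options" above
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir=lib \
        -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Denable_docs=false \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base \
        -Denable_kmods=false \
        -Dmachine=native \
        -Dtests=false

With enable_drivers restricted this way, only the bus, ring-mempool and i40e net drivers are configured; every other driver is reported above as "not in enabled drivers build config".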
00:02:02.620 10:56:27 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:02:02.620 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:02.620 [1/740] Generating lib/rte_kvargs_def with a custom command 00:02:02.620 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:02:02.620 [3/740] Generating lib/rte_telemetry_mingw with a custom command 00:02:02.620 [4/740] Generating lib/rte_telemetry_def with a custom command 00:02:02.620 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:02.620 [6/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:02.886 [7/740] Generating lib/rte_ring_def with a custom command 00:02:02.887 [8/740] Generating lib/rte_mbuf_def with a custom command 00:02:02.887 [9/740] Generating lib/rte_rcu_def with a custom command 00:02:02.887 [10/740] Generating lib/rte_eal_mingw with a custom command 00:02:02.887 [11/740] Generating lib/rte_mempool_def with a custom command 00:02:02.887 [12/740] Generating lib/rte_rcu_mingw with a custom command 00:02:02.887 [13/740] Generating lib/rte_net_mingw with a custom command 00:02:02.887 [14/740] Generating lib/rte_mempool_mingw with a custom command 00:02:02.887 [15/740] Generating lib/rte_meter_mingw with a custom command 00:02:02.887 [16/740] Generating lib/rte_meter_def with a custom command 00:02:02.887 [17/740] Generating lib/rte_eal_def with a custom command 00:02:02.887 [18/740] Generating lib/rte_ring_mingw with a custom command 00:02:02.887 [19/740] Generating lib/rte_mbuf_mingw with a custom command 00:02:02.887 [20/740] Generating lib/rte_net_def with a custom command 00:02:02.887 [21/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:02.887 [22/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:02.887 [23/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:02.887 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:02.887 [25/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:02:02.887 [26/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:02.887 [27/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:02.887 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:02.887 [29/740] Generating lib/rte_pci_def with a custom command 00:02:02.887 [30/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:02.887 [31/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:02.887 [32/740] Generating lib/rte_pci_mingw with a custom command 00:02:02.887 [33/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:02.887 [34/740] Generating lib/rte_ethdev_mingw with a custom command 00:02:02.887 [35/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:02.887 [36/740] Generating lib/rte_ethdev_def with a custom command 00:02:02.887 [37/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:02.887 [38/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:02.887 [39/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:02.887 [40/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:02.887 [41/740] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:02.887 [42/740] Generating lib/rte_cmdline_def with a custom command 00:02:02.887 [43/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:02.887 [44/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:02.887 [45/740] Linking static target lib/librte_kvargs.a 00:02:02.887 [46/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:02.887 [47/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:02.887 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:02:02.887 [49/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:02.887 [50/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:02.887 [51/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:02.887 [52/740] Generating lib/rte_metrics_def with a custom command 00:02:02.887 [53/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:02.887 [54/740] Generating lib/rte_metrics_mingw with a custom command 00:02:02.887 [55/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:02.887 [56/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:02.887 [57/740] Generating lib/rte_hash_def with a custom command 00:02:02.887 [58/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:02.887 [59/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:02.887 [60/740] Generating lib/rte_timer_def with a custom command 00:02:02.887 [61/740] Generating lib/rte_hash_mingw with a custom command 00:02:02.887 [62/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:02.887 [63/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:02.887 [64/740] Generating lib/rte_timer_mingw with a custom command 00:02:02.887 [65/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:02.887 [66/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:02.887 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:02.887 [68/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:02.887 [69/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:02.887 [70/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:02.887 [71/740] Generating lib/rte_acl_mingw with a custom command 00:02:02.887 [72/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:02.887 [73/740] Generating lib/rte_acl_def with a custom command 00:02:02.887 [74/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:03.156 [75/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:03.156 [76/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:03.156 [77/740] Generating lib/rte_bitratestats_def with a custom command 00:02:03.156 [78/740] Generating lib/rte_bbdev_mingw with a custom command 00:02:03.156 [79/740] Generating lib/rte_bitratestats_mingw with a custom command 00:02:03.156 [80/740] Generating lib/rte_bbdev_def with a custom command 00:02:03.156 [81/740] Linking static target lib/librte_pci.a 00:02:03.156 [82/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:03.156 [83/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:03.156 
[84/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:03.156 [85/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:03.156 [86/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:03.156 [87/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:03.156 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:03.156 [89/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:03.156 [90/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:03.156 [91/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:03.156 [92/740] Generating lib/rte_bpf_def with a custom command 00:02:03.156 [93/740] Generating lib/rte_bpf_mingw with a custom command 00:02:03.156 [94/740] Generating lib/rte_cfgfile_def with a custom command 00:02:03.156 [95/740] Generating lib/rte_cfgfile_mingw with a custom command 00:02:03.156 [96/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:03.156 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:03.156 [98/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:03.156 [99/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:03.156 [100/740] Linking static target lib/librte_meter.a 00:02:03.156 [101/740] Generating lib/rte_compressdev_mingw with a custom command 00:02:03.156 [102/740] Linking static target lib/librte_ring.a 00:02:03.156 [103/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:03.156 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:03.156 [105/740] Generating lib/rte_compressdev_def with a custom command 00:02:03.156 [106/740] Generating lib/rte_cryptodev_mingw with a custom command 00:02:03.156 [107/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:03.156 [108/740] Generating lib/rte_cryptodev_def with a custom command 00:02:03.156 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:03.156 [110/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:02:03.156 [111/740] Generating lib/rte_distributor_mingw with a custom command 00:02:03.156 [112/740] Generating lib/rte_efd_def with a custom command 00:02:03.156 [113/740] Generating lib/rte_distributor_def with a custom command 00:02:03.156 [114/740] Generating lib/rte_efd_mingw with a custom command 00:02:03.156 [115/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:03.156 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:03.156 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:03.156 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:03.156 [119/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:03.156 [120/740] Generating lib/rte_gpudev_mingw with a custom command 00:02:03.156 [121/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:03.156 [122/740] Generating lib/rte_eventdev_def with a custom command 00:02:03.156 [123/740] Generating lib/rte_eventdev_mingw with a custom command 00:02:03.156 [124/740] Generating lib/rte_gpudev_def with a custom command 00:02:03.156 [125/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:03.156 
[126/740] Generating lib/rte_gro_mingw with a custom command 00:02:03.156 [127/740] Generating lib/rte_gro_def with a custom command 00:02:03.156 [128/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:03.156 [129/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:03.156 [130/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:03.157 [131/740] Generating lib/rte_gso_def with a custom command 00:02:03.157 [132/740] Generating lib/rte_gso_mingw with a custom command 00:02:03.417 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:03.417 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:03.417 [135/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:03.417 [136/740] Generating lib/rte_ip_frag_def with a custom command 00:02:03.417 [137/740] Generating lib/rte_ip_frag_mingw with a custom command 00:02:03.417 [138/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.417 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.418 [140/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:03.418 [141/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:03.418 [142/740] Generating lib/rte_jobstats_def with a custom command 00:02:03.418 [143/740] Generating lib/rte_jobstats_mingw with a custom command 00:02:03.418 [144/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:03.418 [145/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:03.418 [146/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:03.418 [147/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:03.418 [148/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:03.418 [149/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:03.418 [150/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:03.418 [151/740] Linking static target lib/librte_cfgfile.a 00:02:03.418 [152/740] Linking target lib/librte_kvargs.so.23.0 00:02:03.418 [153/740] Generating lib/rte_latencystats_mingw with a custom command 00:02:03.418 [154/740] Generating lib/rte_latencystats_def with a custom command 00:02:03.418 [155/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:03.418 [156/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:03.418 [157/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.418 [158/740] Generating lib/rte_lpm_def with a custom command 00:02:03.418 [159/740] Generating lib/rte_lpm_mingw with a custom command 00:02:03.418 [160/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:03.418 [161/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:03.418 [162/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.418 [163/740] Generating lib/rte_member_def with a custom command 00:02:03.418 [164/740] Generating lib/rte_member_mingw with a custom command 00:02:03.418 [165/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:03.418 [166/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:03.418 [167/740] Generating lib/rte_pcapng_def with a 
custom command 00:02:03.418 [168/740] Generating lib/rte_pcapng_mingw with a custom command 00:02:03.685 [169/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:03.685 [170/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:03.685 [171/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:03.685 [172/740] Linking static target lib/librte_jobstats.a 00:02:03.685 [173/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:03.685 [174/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:03.685 [175/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:03.685 [176/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:03.685 [177/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:03.685 [178/740] Generating lib/rte_power_def with a custom command 00:02:03.685 [179/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:03.685 [180/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:03.685 [181/740] Linking static target lib/librte_cmdline.a 00:02:03.685 [182/740] Generating lib/rte_power_mingw with a custom command 00:02:03.685 [183/740] Linking static target lib/librte_timer.a 00:02:03.685 [184/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:03.685 [185/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:03.685 [186/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:03.685 [187/740] Linking static target lib/librte_metrics.a 00:02:03.685 [188/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:03.685 [189/740] Generating lib/rte_rawdev_def with a custom command 00:02:03.685 [190/740] Linking static target lib/librte_telemetry.a 00:02:03.685 [191/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:03.685 [192/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:03.685 [193/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:03.685 [194/740] Generating lib/rte_rawdev_mingw with a custom command 00:02:03.685 [195/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:03.685 [196/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:03.685 [197/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:03.685 [198/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:03.685 [199/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:03.685 [200/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:03.685 [201/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:03.685 [202/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:03.685 [203/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:03.685 [204/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:03.686 [205/740] Generating lib/rte_regexdev_def with a custom command 00:02:03.686 [206/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:03.686 [207/740] Generating lib/rte_dmadev_def with a custom command 00:02:03.686 [208/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:03.686 [209/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:03.686 [210/740] Compiling C object 
lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:03.686 [211/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:03.686 [212/740] Linking static target lib/librte_net.a 00:02:03.686 [213/740] Generating lib/rte_rib_def with a custom command 00:02:03.686 [214/740] Generating lib/rte_regexdev_mingw with a custom command 00:02:03.686 [215/740] Generating lib/rte_dmadev_mingw with a custom command 00:02:03.686 [216/740] Generating lib/rte_rib_mingw with a custom command 00:02:03.686 [217/740] Generating lib/rte_reorder_mingw with a custom command 00:02:03.686 [218/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:03.686 [219/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:03.686 [220/740] Generating lib/rte_reorder_def with a custom command 00:02:03.686 [221/740] Generating lib/rte_sched_mingw with a custom command 00:02:03.686 [222/740] Linking static target lib/librte_bitratestats.a 00:02:03.686 [223/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:03.686 [224/740] Generating lib/rte_security_def with a custom command 00:02:03.686 [225/740] Generating lib/rte_sched_def with a custom command 00:02:03.686 [226/740] Generating lib/rte_security_mingw with a custom command 00:02:03.686 [227/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:03.686 [228/740] Generating lib/rte_stack_def with a custom command 00:02:03.686 [229/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:03.686 [230/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:03.686 [231/740] Generating lib/rte_stack_mingw with a custom command 00:02:03.686 [232/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:03.686 [233/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:03.686 [234/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:03.686 [235/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:03.686 [236/740] Generating lib/rte_vhost_mingw with a custom command 00:02:03.950 [237/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:03.950 [238/740] Generating lib/rte_vhost_def with a custom command 00:02:03.950 [239/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:03.950 [240/740] Generating lib/rte_ipsec_def with a custom command 00:02:03.950 [241/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:03.950 [242/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:03.950 [243/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:03.950 [244/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:03.950 [245/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:02:03.950 [246/740] Generating lib/rte_ipsec_mingw with a custom command 00:02:03.950 [247/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:03.950 [248/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:03.950 [249/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:03.950 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:03.950 [251/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:03.950 [252/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:03.950 [253/740] Compiling C object 
lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:03.950 [254/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:03.950 [255/740] Generating lib/rte_fib_mingw with a custom command 00:02:03.950 [256/740] Generating lib/rte_fib_def with a custom command 00:02:03.950 [257/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:03.950 [258/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:03.950 [259/740] Linking static target lib/librte_stack.a 00:02:03.950 [260/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:03.950 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:03.950 [262/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:03.950 [263/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:03.950 [264/740] Linking static target lib/librte_compressdev.a 00:02:03.950 [265/740] Generating lib/rte_port_def with a custom command 00:02:03.950 [266/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:03.950 [267/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:03.950 [268/740] Generating lib/rte_pdump_def with a custom command 00:02:03.950 [269/740] Generating lib/rte_port_mingw with a custom command 00:02:03.950 [270/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:03.950 [271/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:03.950 [272/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:03.950 [273/740] Generating lib/rte_pdump_mingw with a custom command 00:02:03.950 [274/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:03.950 [275/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:03.950 [276/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.950 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:03.950 [278/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:03.950 [279/740] Linking static target lib/librte_rcu.a 00:02:03.950 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:04.215 [281/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:04.215 [282/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [283/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:04.215 [284/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:04.215 [285/740] Linking static target lib/librte_mempool.a 00:02:04.215 [286/740] Linking static target lib/librte_rawdev.a 00:02:04.215 [287/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [288/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:04.215 [289/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:04.215 [290/740] Linking static target lib/librte_bbdev.a 00:02:04.215 [291/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:04.215 [292/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:04.215 [293/740] Generating lib/rte_table_def with a custom command 00:02:04.215 [294/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:04.215 [295/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:04.215 
[296/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:04.215 [297/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:04.215 [298/740] Linking static target lib/librte_gro.a 00:02:04.215 [299/740] Generating lib/rte_table_mingw with a custom command 00:02:04.215 [300/740] Linking static target lib/librte_gpudev.a 00:02:04.215 [301/740] Linking static target lib/librte_dmadev.a 00:02:04.215 [302/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [303/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:04.215 [304/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:04.215 [305/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:04.215 [306/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:04.215 [307/740] Generating lib/rte_pipeline_def with a custom command 00:02:04.215 [308/740] Linking static target lib/librte_gso.a 00:02:04.215 [309/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:04.215 [310/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:04.215 [311/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:04.215 [312/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:02:04.215 [313/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:04.215 [314/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [315/740] Generating lib/rte_pipeline_mingw with a custom command 00:02:04.215 [316/740] Linking static target lib/librte_latencystats.a 00:02:04.215 [317/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [318/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:04.215 [319/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [320/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.215 [321/740] Linking static target lib/librte_distributor.a 00:02:04.215 [322/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:04.215 [323/740] Generating lib/rte_graph_mingw with a custom command 00:02:04.476 [324/740] Generating lib/rte_graph_def with a custom command 00:02:04.476 [325/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:04.476 [326/740] Linking static target lib/librte_ip_frag.a 00:02:04.476 [327/740] Linking target lib/librte_telemetry.so.23.0 00:02:04.476 [328/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:04.476 [329/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:04.476 [330/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:04.476 [331/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:04.476 [332/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:04.476 [333/740] Linking static target lib/librte_regexdev.a 00:02:04.476 [334/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:04.476 [335/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:04.476 [336/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:04.476 [337/740] Compiling C object 
lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:04.476 [338/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:04.476 [339/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:04.476 [340/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:04.476 [341/740] Linking static target lib/librte_eal.a 00:02:04.476 [342/740] Generating lib/rte_node_mingw with a custom command 00:02:04.476 [343/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:04.476 [344/740] Generating lib/rte_node_def with a custom command 00:02:04.476 [345/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.476 [346/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.476 [347/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:04.476 [348/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:04.476 [349/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:04.476 [350/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:04.476 [351/740] Generating drivers/rte_bus_pci_def with a custom command 00:02:04.476 [352/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.745 [353/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:02:04.745 [354/740] Linking static target lib/librte_reorder.a 00:02:04.745 [355/740] Linking static target lib/librte_power.a 00:02:04.745 [356/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:04.745 [357/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.745 [358/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:04.745 [359/740] Generating drivers/rte_bus_vdev_def with a custom command 00:02:04.745 [360/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:04.745 [361/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:04.745 [362/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:02:04.745 [363/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:04.745 [364/740] Generating drivers/rte_mempool_ring_def with a custom command 00:02:04.745 [365/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:04.745 [366/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:02:04.745 [367/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:04.745 [368/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:04.745 [369/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:04.745 [370/740] Linking static target lib/librte_mbuf.a 00:02:04.745 [371/740] Linking static target lib/librte_pcapng.a 00:02:04.745 [372/740] Linking static target lib/librte_security.a 00:02:04.745 [373/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:04.745 [374/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:04.745 [375/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.745 [376/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:04.745 [377/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:04.746 [378/740] Compiling C object 
lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:04.746 [379/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:04.746 [380/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:04.746 [381/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:04.746 [382/740] Linking static target lib/librte_bpf.a 00:02:04.746 [383/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:04.746 [384/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:04.746 [385/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:04.746 [386/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.746 [387/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:05.013 [388/740] Generating drivers/rte_net_i40e_def with a custom command 00:02:05.013 [389/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:05.013 [390/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.013 [391/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:02:05.013 [392/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:05.013 [393/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:05.013 [394/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:05.013 [395/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:05.013 [396/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:05.013 [397/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:05.013 [398/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:05.013 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:05.013 [400/740] Linking static target lib/librte_lpm.a 00:02:05.013 [401/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:05.013 [402/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:05.013 [403/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:05.013 [404/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:05.013 [405/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:05.013 [406/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:05.013 [407/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:05.013 [408/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:05.013 [409/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:05.013 [410/740] Linking static target lib/librte_rib.a 00:02:05.013 [411/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.013 [412/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:05.013 [413/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:05.013 [414/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:05.013 [415/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:05.013 [416/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:05.013 [417/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:05.014 [418/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:05.014 
[419/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:05.014 [420/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:05.014 [421/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.014 [422/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:05.014 [423/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:05.014 [424/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:05.014 [425/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:05.014 [426/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:05.014 [427/740] Linking static target lib/librte_graph.a 00:02:05.014 [428/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.278 [429/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:05.278 [430/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:05.278 [431/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:05.278 [432/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:05.278 [433/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:05.278 [434/740] Linking static target lib/librte_efd.a 00:02:05.278 [435/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:05.278 [436/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:05.278 [437/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.278 [438/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:05.278 [439/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:05.278 [440/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.278 [441/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:05.278 [442/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:05.278 [443/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:05.278 [444/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:05.278 [445/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:05.278 [446/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:05.547 [447/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.547 [448/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:05.547 [449/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:05.547 [450/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:05.547 [451/740] Linking static target lib/librte_fib.a 00:02:05.547 [452/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:05.547 [453/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:05.547 [454/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.547 [455/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.547 [456/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:05.547 [457/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:05.547 [458/740] Linking static target 
drivers/librte_bus_vdev.a 00:02:05.547 [459/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.547 [460/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:05.547 [461/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.547 [462/740] Linking static target lib/librte_pdump.a 00:02:05.547 [463/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:05.547 [464/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.547 [465/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:05.547 [466/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:05.815 [467/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:05.815 [468/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:05.815 [469/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:05.815 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:05.815 [471/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:05.815 [472/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.815 [473/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:05.815 [474/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:05.815 [475/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.815 [476/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:05.815 [477/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:05.815 [478/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.815 [479/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:05.815 [480/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:05.815 [481/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:05.815 [482/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:05.815 [483/740] Linking static target drivers/librte_bus_pci.a 00:02:05.815 [484/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:05.815 [485/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.077 [486/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:06.077 [487/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:06.077 [488/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:06.077 [489/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:06.077 [490/740] Linking static target lib/librte_table.a 00:02:06.077 [491/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:06.077 [492/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:06.077 [493/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:06.077 [494/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:06.077 
[495/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:06.077 [496/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:06.077 [497/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.077 [498/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.077 [499/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:02:06.077 [500/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:02:06.077 [501/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:06.077 [502/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:06.077 [503/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:06.077 [504/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.077 [505/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:06.077 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:06.077 [507/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:06.077 [508/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:06.077 [509/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:06.077 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:06.077 [511/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:06.077 [512/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:06.337 [513/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:06.337 [514/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:06.337 [515/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:06.337 [516/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:06.337 [517/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:06.337 [518/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.337 [519/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:06.337 [520/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:06.337 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:06.337 [522/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:06.337 [523/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:06.337 [524/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:06.337 [525/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:06.337 [526/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:06.337 [527/740] Linking static target lib/librte_sched.a 00:02:06.337 [528/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:06.337 [529/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:06.337 [530/740] Linking static target lib/librte_cryptodev.a 00:02:06.337 [531/740] Compiling C object 
lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:06.337 [532/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:06.337 [533/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.337 [534/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:06.337 [535/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:06.337 [536/740] Linking static target lib/librte_node.a 00:02:06.337 [537/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:06.337 [538/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:06.337 [539/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:06.337 [540/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:06.337 [541/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:06.337 [542/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:06.596 [543/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:06.596 [544/740] Linking static target lib/librte_ethdev.a 00:02:06.596 [545/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:06.596 [546/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:06.596 [547/740] Linking static target lib/librte_ipsec.a 00:02:06.596 [548/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:06.596 [549/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.596 [550/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:06.596 [551/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:06.596 [552/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:06.596 [553/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:06.596 [554/740] Linking static target drivers/librte_mempool_ring.a 00:02:06.596 [555/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:06.596 [556/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:06.596 [557/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:06.596 [558/740] Linking static target lib/librte_member.a 00:02:06.596 [559/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:06.596 [560/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:06.596 [561/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:06.596 [562/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.596 [563/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:06.596 [564/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:06.596 [565/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:06.596 [566/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:06.596 [567/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:06.596 [568/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:06.596 [569/740] Compiling C object 
drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:06.596 [570/740] Linking static target lib/librte_port.a 00:02:06.596 [571/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:06.596 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:06.855 [573/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:06.855 [574/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:06.855 [575/740] Linking static target lib/librte_eventdev.a 00:02:06.855 [576/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:06.855 [577/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:06.855 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:06.855 [579/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:06.855 [580/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:06.855 [581/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:06.855 [582/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:06.855 [583/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:06.855 [584/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.855 [585/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.855 [586/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:06.855 [587/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:06.855 [588/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:06.855 [589/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:06.855 [590/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:06.855 [591/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:06.855 [592/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:07.115 [593/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.115 [594/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:07.115 [595/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:07.115 [596/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.115 [597/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:07.115 [598/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:07.115 [599/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:07.115 [600/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:07.115 [601/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:07.115 [602/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:07.115 [603/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:07.115 [604/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:02:07.115 [605/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:07.115 [606/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:07.375 [607/740] Linking static target lib/librte_acl.a 
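The recurring "Generating lib/<name>.sym_chk with a custom command (wrapped by meson to capture output)" entries, e.g. lib/acl.sym_chk and lib/hash.sym_chk just below, are DPDK's symbol-export checks: the global symbols defined by each freshly built library are compared against that library's version.map so that nothing is exported unintentionally. A rough manual equivalent, with paths assumed rather than taken from this log:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    # global symbols actually defined by the built static library
    nm -g --defined-only build-tmp/lib/librte_hash.a | awk 'NF == 3 {print $3}' | sort -u
    # exports the library declares in its map file; the two lists should agree
    cat lib/hash/version.map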
00:02:07.375 [608/740] Linking static target lib/librte_hash.a 00:02:07.375 [609/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:07.375 [610/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:07.375 [611/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:07.375 [612/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:02:07.639 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.639 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:07.639 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.899 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:07.899 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:08.468 [618/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:08.468 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:08.468 [620/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.468 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:09.038 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:09.038 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:09.299 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:09.299 [625/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:09.299 [626/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:09.299 [627/740] Linking static target drivers/librte_net_i40e.a 00:02:09.870 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:10.130 [629/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:10.130 [630/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.130 [631/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.390 [632/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:10.650 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.932 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.502 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:16.502 [636/740] Linking static target lib/librte_vhost.a 00:02:16.763 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:16.763 [638/740] Linking static target lib/librte_pipeline.a 00:02:17.023 [639/740] Linking target app/dpdk-dumpcap 00:02:17.023 [640/740] Linking target app/dpdk-proc-info 00:02:17.023 [641/740] Linking target app/dpdk-test-cmdline 00:02:17.023 [642/740] Linking target app/dpdk-test-acl 00:02:17.023 [643/740] Linking target app/dpdk-pdump 00:02:17.023 [644/740] Linking target app/dpdk-test-gpudev 00:02:17.023 [645/740] Linking target app/dpdk-test-sad 00:02:17.023 [646/740] Linking target app/dpdk-test-fib 00:02:17.023 [647/740] Linking target app/dpdk-test-flow-perf 00:02:17.023 [648/740] Linking target app/dpdk-test-pipeline 00:02:17.023 [649/740] Linking target app/dpdk-test-bbdev 00:02:17.023 [650/740] Linking target 
app/dpdk-test-regex 00:02:17.023 [651/740] Linking target app/dpdk-test-compress-perf 00:02:17.023 [652/740] Linking target app/dpdk-test-security-perf 00:02:17.023 [653/740] Linking target app/dpdk-test-crypto-perf 00:02:17.023 [654/740] Linking target app/dpdk-test-eventdev 00:02:17.283 [655/740] Linking target app/dpdk-testpmd 00:02:18.671 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.671 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.931 [658/740] Linking target lib/librte_eal.so.23.0 00:02:18.931 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:02:18.931 [660/740] Linking target lib/librte_timer.so.23.0 00:02:18.931 [661/740] Linking target lib/librte_meter.so.23.0 00:02:18.931 [662/740] Linking target lib/librte_pci.so.23.0 00:02:18.931 [663/740] Linking target lib/librte_graph.so.23.0 00:02:18.931 [664/740] Linking target lib/librte_stack.so.23.0 00:02:18.931 [665/740] Linking target lib/librte_jobstats.so.23.0 00:02:18.931 [666/740] Linking target lib/librte_dmadev.so.23.0 00:02:18.931 [667/740] Linking target lib/librte_rawdev.so.23.0 00:02:18.931 [668/740] Linking target lib/librte_cfgfile.so.23.0 00:02:18.931 [669/740] Linking target lib/librte_ring.so.23.0 00:02:18.931 [670/740] Linking target drivers/librte_bus_vdev.so.23.0 00:02:18.931 [671/740] Linking target lib/librte_acl.so.23.0 00:02:19.191 [672/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:02:19.191 [673/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:02:19.191 [674/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:02:19.191 [675/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:02:19.191 [676/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:02:19.191 [677/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:02:19.191 [678/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:02:19.191 [679/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:02:19.191 [680/740] Linking target drivers/librte_bus_pci.so.23.0 00:02:19.191 [681/740] Linking target lib/librte_mempool.so.23.0 00:02:19.191 [682/740] Linking target lib/librte_rcu.so.23.0 00:02:19.451 [683/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:02:19.451 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:02:19.451 [685/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:02:19.451 [686/740] Linking target drivers/librte_mempool_ring.so.23.0 00:02:19.451 [687/740] Linking target lib/librte_mbuf.so.23.0 00:02:19.451 [688/740] Linking target lib/librte_rib.so.23.0 00:02:19.451 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:02:19.711 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:02:19.711 [691/740] Linking target lib/librte_fib.so.23.0 00:02:19.711 [692/740] Linking target lib/librte_bbdev.so.23.0 00:02:19.711 [693/740] Linking target lib/librte_compressdev.so.23.0 00:02:19.711 [694/740] Linking target lib/librte_gpudev.so.23.0 00:02:19.711 [695/740] Linking target lib/librte_net.so.23.0 00:02:19.711 [696/740] Linking target 
lib/librte_regexdev.so.23.0
00:02:19.711 [697/740] Linking target lib/librte_reorder.so.23.0
00:02:19.711 [698/740] Linking target lib/librte_distributor.so.23.0
00:02:19.711 [699/740] Linking target lib/librte_cryptodev.so.23.0
00:02:19.711 [700/740] Linking target lib/librte_sched.so.23.0
00:02:19.711 [701/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols
00:02:19.711 [702/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols
00:02:19.711 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols
00:02:19.711 [704/740] Linking target lib/librte_security.so.23.0
00:02:19.970 [705/740] Linking target lib/librte_hash.so.23.0
00:02:19.970 [706/740] Linking target lib/librte_cmdline.so.23.0
00:02:19.970 [707/740] Linking target lib/librte_ethdev.so.23.0
00:02:19.970 [708/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols
00:02:19.970 [709/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols
00:02:19.970 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols
00:02:19.970 [711/740] Linking target lib/librte_lpm.so.23.0
00:02:19.970 [712/740] Linking target lib/librte_efd.so.23.0
00:02:19.970 [713/740] Linking target lib/librte_ipsec.so.23.0
00:02:19.970 [714/740] Linking target lib/librte_member.so.23.0
00:02:19.970 [715/740] Linking target lib/librte_metrics.so.23.0
00:02:19.970 [716/740] Linking target lib/librte_pcapng.so.23.0
00:02:19.970 [717/740] Linking target lib/librte_gso.so.23.0
00:02:19.970 [718/740] Linking target lib/librte_bpf.so.23.0
00:02:19.970 [719/740] Linking target lib/librte_ip_frag.so.23.0
00:02:19.971 [720/740] Linking target lib/librte_gro.so.23.0
00:02:19.971 [721/740] Linking target lib/librte_power.so.23.0
00:02:19.971 [722/740] Linking target lib/librte_eventdev.so.23.0
00:02:19.971 [723/740] Linking target lib/librte_vhost.so.23.0
00:02:20.231 [724/740] Linking target drivers/librte_net_i40e.so.23.0
00:02:20.231 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols
00:02:20.231 [726/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols
00:02:20.231 [727/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols
00:02:20.231 [728/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols
00:02:20.231 [729/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols
00:02:20.231 [730/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols
00:02:20.231 [731/740] Linking target lib/librte_node.so.23.0
00:02:20.231 [732/740] Linking target lib/librte_bitratestats.so.23.0
00:02:20.231 [733/740] Linking target lib/librte_latencystats.so.23.0
00:02:20.231 [734/740] Linking target lib/librte_pdump.so.23.0
00:02:20.231 [735/740] Linking target lib/librte_port.so.23.0
00:02:20.491 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols
00:02:20.491 [737/740] Linking target lib/librte_table.so.23.0
00:02:20.491 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols
00:02:21.874 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:22.135 [740/740] Linking target lib/librte_pipeline.so.23.0
00:02:22.135 10:56:46 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s
00:02:22.135 10:56:46 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:22.135 10:56:46 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
00:02:22.135 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:22.135 [0/1] Installing files.
00:02:22.400 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.400 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:22.401 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 
00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.401 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:22.402 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:22.403 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:22.403 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.403 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.404 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.405 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
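Note: the example sources staged above under build/share/dpdk/examples are installed as plain source trees; this step copies them but does not compile them. A minimal sketch of rebuilding one of them standalone against this install, assuming the meson layout used here leaves the pkg-config files under dpdk/build/lib/pkgconfig:

    # Hedged sketch, not part of this job's commands: point pkg-config at the
    # freshly installed SDK, then drive the staged example's own Makefile.
    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    make -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd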
00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:22.406 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:22.406 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:22.407 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:22.407 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:22.407 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing 
lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_gso.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.407 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.408 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.408 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
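Note: each library above is installed twice, as a static archive (.a) and as a versioned shared object (.so.23.0). A hedged sketch for sanity-checking one shared object and linking a trivial program against the installed set, again assuming the .pc files sit under dpdk/build/lib/pkgconfig:

    # Hedged sketch: confirm the SONAME baked into an installed shared object,
    readelf -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23.0 | grep SONAME
    # then link a trivial program against the installed set via pkg-config.
    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    echo 'int main(void){return 0;}' > /tmp/linkcheck.c
    cc /tmp/linkcheck.c -o /tmp/linkcheck $(pkg-config --cflags --libs libdpdk)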
00:02:22.672 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:22.672 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:22.672 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:22.672 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.672 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:22.672 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 
Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 
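Note: the dpdk-* tools above land in build/bin as standalone binaries; actually running them needs EAL arguments, hugepages and bound devices, so a dry option-parsing check is the safe smoke test at this stage. A hedged sketch:

    # Hedged sketch: --help exercises EAL option parsing without hugepages or
    # NICs; usage text on the console is the pass criterion ("|| true" because
    # the exit code of --help is not relied upon here).
    /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin/dpdk-testpmd --help || true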
00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
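Note: the headers above are staged flat into build/include, with the architecture-generic fallbacks under build/include/generic, so the staged tree should be compilable on its own. A hedged compile-only check (no linking), assuming an x86 host where -march=native satisfies the SSE expectations of the arch-specific headers:

    # Hedged sketch: compile a one-line translation unit against the staged
    # headers; /tmp paths and the name hdrcheck.c are illustrative only.
    printf '#include <rte_ring.h>\nint main(void) { return 0; }\n' > /tmp/hdrcheck.c
    cc -march=native -I/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include -c /tmp/hdrcheck.c -o /tmp/hdrcheck.o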
00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.675 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.676 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:22.677 
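The four usertools helpers staged into dpdk/build/bin just above (dpdk-devbind.py, dpdk-pmdinfo.py, dpdk-telemetry.py, dpdk-hugepages.py) are DPDK's standard host-management scripts. For reference only, typical invocations might look like the sketch below; the flags shown are the commonly documented ones and are not taken from this run's output:

    BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
    "$BIN/dpdk-hugepages.py" --show      # report current hugepage mounts and reservations
    "$BIN/dpdk-devbind.py" --status      # list NICs with their kernel/userspace driver bindings
    # Hypothetical: dump PMD metadata from the i40e driver staged later in this log.
    "$BIN/dpdk-pmdinfo.py" /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net_i40e.so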
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:22.677 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:22.677 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:22.677 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:22.677 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:22.677 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:22.677 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:22.677 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:22.677 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:22.677 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:22.677 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:22.677 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:22.677 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:22.677 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:22.677 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:22.677 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:22.677 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:22.677 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:22.677 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:22.677 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:22.677 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:22.677 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:22.677 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:22.677 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:22.677 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:22.677 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:22.677 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:22.677 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:22.677 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:22.677 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:22.677 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:22.677 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:22.677 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:22.677 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:22.677 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:22.677 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:22.677 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:22.677 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:22.677 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:22.677 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:22.677 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:22.677 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:22.677 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:22.677 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:22.677 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:22.677 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:22.677 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:22.677 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:22.677 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:22.678 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:22.678 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:22.678 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:22.678 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:22.678 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:22.678 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:22.678 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:22.678 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:22.678 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:22.678 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:22.678 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:22.678 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:22.678 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:22.678 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:22.678 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:22.678 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:22.678 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:22.678 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:22.678 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:22.678 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:22.678 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:22.678 
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:22.678 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:22.678 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:22.678 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:22.678 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:22.678 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:22.678 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:22.678 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:22.678 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:22.678 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:22.678 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:22.678 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:22.678 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:22.678 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:22.678 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:22.678 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:22.678 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:22.678 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:22.678 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:22.678 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:22.678 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:22.678 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:22.678 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:22.678 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:22.678 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:22.678 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:22.678 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:22.678 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:22.678 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:22.678 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:22.678 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:22.678 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:22.678 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:22.678 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:22.678 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:22.678 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:22.678 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:22.678 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:22.678 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:22.678 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:22.678 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:22.678 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:22.678 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:22.678 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:22.678 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:22.678 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:22.678 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:22.678 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:22.678 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:22.678 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:22.678 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:22.678 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:22.678 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23
00:02:22.678 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so
00:02:22.678 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23
00:02:22.678 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so
00:02:22.678 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0'
00:02:22.678 10:56:47 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat
00:02:22.678 10:56:47 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:22.678
00:02:22.678 real 0m27.748s
00:02:22.678 user 6m36.817s
00:02:22.678 sys 2m25.166s
00:02:22.678 10:56:47 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:22.678 10:56:47 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x
00:02:22.678 ************************************
00:02:22.678 END TEST build_native_dpdk
00:02:22.678 ************************************
00:02:22.939 10:56:47 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:22.939 10:56:47 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:22.939 10:56:47 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:22.939 10:56:47 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:22.939 10:56:47 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:22.939 10:56:47 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:02:22.939 10:56:47 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:22.939 10:56:47 -- common/autotest_common.sh@10 -- $ set +x
00:02:22.939 ************************************
00:02:22.939 START TEST autobuild_llvm_precompile
00:02:22.939 ************************************
00:02:22.939 10:56:47 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile
10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:02:23.199 Target: x86_64-redhat-linux-gnu
00:02:23.199 Thread model: posix
00:02:23.199 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
00:02:23.199 10:56:47 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:23.460 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:23.720 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:23.720 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:23.980 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:24.550 Using 'verbs' RDMA provider
00:02:40.838 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:02:55.740 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:02:55.740 Creating mk/config.mk...done.
00:02:55.740 Creating mk/cc.flags.mk...done.
00:02:55.740 Type 'make' to build.
00:02:55.740
00:02:55.740 real 0m32.548s
00:02:55.740 user 0m13.161s
00:02:55.740 sys 0m18.756s
00:02:55.740 10:57:19 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:55.740 10:57:19 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x
00:02:55.740 ************************************
00:02:55.740 END TEST autobuild_llvm_precompile
00:02:55.740 ************************************
00:02:55.740 10:57:19 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:55.740 10:57:19 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:55.740 10:57:19 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:55.740 10:57:19 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:55.740 10:57:19 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:55.740 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
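The _llvm_precompile xtrace above (autobuild_common.sh@32 through @40) pins the toolchain before configure runs: it parses the clang version, exports CC/CXX, and globs for the libFuzzer archive that --with-fuzzer then points at. A standalone sketch of the same probe, with the regex and glob copied from the trace (extglob enables the @(...) and ?(...) patterns; the dots in the version regex are escaped here for strictness):

    shopt -s extglob nullglob
    if [[ $(clang --version) =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_version=${BASH_REMATCH[1]}   # 17.0.6 on this node
        clang_num=${BASH_REMATCH[2]}       # 17
    fi
    export CC=clang-$clang_num CXX=clang++-$clang_num
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    fuzzer_lib=${fuzzer_libs[0]}           # first match; @39 above resolved it to the Fedora clang-17 path
    [[ -e $fuzzer_lib ]] && echo "fuzzer archive: $fuzzer_lib"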
00:02:56.000 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:56.000 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:56.000 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:56.261 Using 'verbs' RDMA provider
00:03:09.875 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:03:22.104 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:03:22.104 Creating mk/config.mk...done.
00:03:22.104 Creating mk/cc.flags.mk...done.
00:03:22.104 Type 'make' to build.
00:03:22.104 10:57:46 -- spdk/autobuild.sh@70 -- $ run_test make make -j112
00:03:22.104 10:57:46 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:03:22.104 10:57:46 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:03:22.104 10:57:46 -- common/autotest_common.sh@10 -- $ set +x
00:03:22.104 ************************************
00:03:22.104 START TEST make
00:03:22.104 ************************************
00:03:22.104 10:57:46 make -- common/autotest_common.sh@1129 -- $ make -j112
00:03:22.104 make[1]: Nothing to be done for 'all'.
00:03:23.497 The Meson build system
00:03:23.497 Version: 1.5.0
00:03:23.497 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:03:23.497 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:23.497 Build type: native build
00:03:23.497 Project name: libvfio-user
00:03:23.497 Project version: 0.0.1
00:03:23.497 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:03:23.497 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:03:23.497 Host machine cpu family: x86_64
00:03:23.497 Host machine cpu: x86_64
00:03:23.497 Run-time dependency threads found: YES
00:03:23.497 Library dl found: YES
00:03:23.497 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:03:23.497 Run-time dependency json-c found: YES 0.17
00:03:23.497 Run-time dependency cmocka found: YES 1.1.7
00:03:23.497 Program pytest-3 found: NO
00:03:23.497 Program flake8 found: NO
00:03:23.497 Program misspell-fixer found: NO
00:03:23.497 Program restructuredtext-lint found: NO
00:03:23.497 Program valgrind found: YES (/usr/bin/valgrind)
00:03:23.497 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:23.497 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:23.497 Compiler for C supports arguments -Wwrite-strings: YES
00:03:23.497 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:23.497 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:03:23.497 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:03:23.497 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
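The Meson configure of libvfio-user recorded above can be reproduced out of tree. A sketch using the source and build directories Meson reports above, with the options echoed in the 'User defined options' summary that follows below; treat it as illustrative rather than the job's actual driver script:

    SRC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
    BLD=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
    meson setup "$BLD" "$SRC" -Dbuildtype=debug -Ddefault_library=static -Dlibdir=/usr/local/lib
    ninja -C "$BLD"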
00:03:23.497 Build targets in project: 8 00:03:23.497 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:23.497 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:23.497 00:03:23.497 libvfio-user 0.0.1 00:03:23.497 00:03:23.497 User defined options 00:03:23.497 buildtype : debug 00:03:23.497 default_library: static 00:03:23.497 libdir : /usr/local/lib 00:03:23.497 00:03:23.497 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:24.068 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:24.068 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:24.068 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:24.068 [3/36] Compiling C object samples/null.p/null.c.o 00:03:24.068 [4/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:24.068 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:24.068 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:24.068 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:24.068 [8/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:24.068 [9/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:24.068 [10/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:24.068 [11/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:24.068 [12/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:24.068 [13/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:24.068 [14/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:24.068 [15/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:24.068 [16/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:24.068 [17/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:24.068 [18/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:24.068 [19/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:24.068 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:24.068 [21/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:24.068 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:24.068 [23/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:24.068 [24/36] Compiling C object samples/server.p/server.c.o 00:03:24.068 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:24.068 [26/36] Compiling C object samples/client.p/client.c.o 00:03:24.068 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:24.068 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:24.068 [29/36] Linking target samples/client 00:03:24.329 [30/36] Linking target test/unit_tests 00:03:24.329 [31/36] Linking static target lib/libvfio-user.a 00:03:24.329 [32/36] Linking target samples/shadow_ioeventfd_server 00:03:24.329 [33/36] Linking target samples/null 00:03:24.329 [34/36] Linking target samples/server 00:03:24.329 [35/36] Linking target samples/lspci 00:03:24.329 [36/36] Linking target samples/gpio-pci-idio-16 00:03:24.329 INFO: autodetecting backend as ninja 00:03:24.329 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:24.329 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:24.590 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:24.590 ninja: no work to do. 00:03:39.547 CC lib/ut/ut.o 00:03:39.547 CC lib/log/log.o 00:03:39.547 CC lib/log/log_flags.o 00:03:39.547 CC lib/log/log_deprecated.o 00:03:39.547 CC lib/ut_mock/mock.o 00:03:39.547 LIB libspdk_ut.a 00:03:39.547 LIB libspdk_ut_mock.a 00:03:39.547 LIB libspdk_log.a 00:03:39.547 CC lib/dma/dma.o 00:03:39.547 CC lib/ioat/ioat.o 00:03:39.547 CXX lib/trace_parser/trace.o 00:03:39.547 CC lib/util/bit_array.o 00:03:39.547 CC lib/util/base64.o 00:03:39.547 CC lib/util/cpuset.o 00:03:39.547 CC lib/util/crc16.o 00:03:39.547 CC lib/util/crc32.o 00:03:39.547 CC lib/util/crc32c.o 00:03:39.547 CC lib/util/crc32_ieee.o 00:03:39.547 CC lib/util/crc64.o 00:03:39.547 CC lib/util/dif.o 00:03:39.547 CC lib/util/fd.o 00:03:39.547 CC lib/util/fd_group.o 00:03:39.547 CC lib/util/file.o 00:03:39.547 CC lib/util/hexlify.o 00:03:39.547 CC lib/util/iov.o 00:03:39.547 CC lib/util/math.o 00:03:39.547 CC lib/util/net.o 00:03:39.547 CC lib/util/pipe.o 00:03:39.547 CC lib/util/strerror_tls.o 00:03:39.547 CC lib/util/string.o 00:03:39.547 CC lib/util/uuid.o 00:03:39.547 CC lib/util/xor.o 00:03:39.547 CC lib/util/zipf.o 00:03:39.547 CC lib/util/md5.o 00:03:39.547 CC lib/vfio_user/host/vfio_user_pci.o 00:03:39.547 CC lib/vfio_user/host/vfio_user.o 00:03:39.547 LIB libspdk_dma.a 00:03:39.547 LIB libspdk_ioat.a 00:03:39.547 LIB libspdk_vfio_user.a 00:03:39.547 LIB libspdk_util.a 00:03:39.547 CC lib/rdma_utils/rdma_utils.o 00:03:39.547 CC lib/vmd/vmd.o 00:03:39.547 CC lib/vmd/led.o 00:03:39.547 LIB libspdk_trace_parser.a 00:03:39.547 CC lib/conf/conf.o 00:03:39.547 CC lib/json/json_parse.o 00:03:39.547 CC lib/json/json_util.o 00:03:39.547 CC lib/json/json_write.o 00:03:39.547 CC lib/env_dpdk/env.o 00:03:39.547 CC lib/idxd/idxd.o 00:03:39.547 CC lib/env_dpdk/memory.o 00:03:39.547 CC lib/idxd/idxd_user.o 00:03:39.547 CC lib/env_dpdk/pci.o 00:03:39.547 CC lib/idxd/idxd_kernel.o 00:03:39.547 CC lib/env_dpdk/init.o 00:03:39.547 CC lib/env_dpdk/threads.o 00:03:39.547 CC lib/env_dpdk/pci_ioat.o 00:03:39.547 CC lib/env_dpdk/pci_virtio.o 00:03:39.547 CC lib/env_dpdk/pci_vmd.o 00:03:39.547 CC lib/env_dpdk/pci_idxd.o 00:03:39.547 CC lib/env_dpdk/pci_event.o 00:03:39.547 CC lib/env_dpdk/sigbus_handler.o 00:03:39.547 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:39.547 CC lib/env_dpdk/pci_dpdk.o 00:03:39.547 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:39.547 LIB libspdk_conf.a 00:03:39.547 LIB libspdk_rdma_utils.a 00:03:39.547 LIB libspdk_json.a 00:03:39.547 LIB libspdk_idxd.a 00:03:39.547 LIB libspdk_vmd.a 00:03:39.547 CC lib/rdma_provider/common.o 00:03:39.547 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:39.547 CC lib/jsonrpc/jsonrpc_server.o 00:03:39.547 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:39.547 CC lib/jsonrpc/jsonrpc_client.o 00:03:39.547 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:39.547 LIB libspdk_rdma_provider.a 00:03:39.547 LIB libspdk_jsonrpc.a 00:03:39.547 LIB libspdk_env_dpdk.a 00:03:39.547 CC lib/rpc/rpc.o 00:03:39.547 LIB libspdk_rpc.a 00:03:39.547 CC lib/trace/trace.o 00:03:39.547 CC lib/trace/trace_flags.o 00:03:39.547 CC lib/trace/trace_rpc.o 00:03:39.547 CC lib/keyring/keyring.o 00:03:39.548 CC lib/keyring/keyring_rpc.o 00:03:39.548 CC lib/notify/notify.o 00:03:39.548 CC lib/notify/notify_rpc.o 00:03:39.808 LIB libspdk_notify.a 00:03:39.808 LIB libspdk_trace.a 00:03:39.808 LIB 
libspdk_keyring.a 00:03:40.069 CC lib/sock/sock.o 00:03:40.069 CC lib/sock/sock_rpc.o 00:03:40.069 CC lib/thread/thread.o 00:03:40.069 CC lib/thread/iobuf.o 00:03:40.330 LIB libspdk_sock.a 00:03:40.591 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:40.591 CC lib/nvme/nvme_ctrlr.o 00:03:40.591 CC lib/nvme/nvme_fabric.o 00:03:40.591 CC lib/nvme/nvme_ns_cmd.o 00:03:40.591 CC lib/nvme/nvme_ns.o 00:03:40.591 CC lib/nvme/nvme_qpair.o 00:03:40.591 CC lib/nvme/nvme_pcie_common.o 00:03:40.591 CC lib/nvme/nvme_pcie.o 00:03:40.591 CC lib/nvme/nvme.o 00:03:40.591 CC lib/nvme/nvme_quirks.o 00:03:40.591 CC lib/nvme/nvme_transport.o 00:03:40.591 CC lib/nvme/nvme_discovery.o 00:03:40.591 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:40.591 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:40.591 CC lib/nvme/nvme_tcp.o 00:03:40.591 CC lib/nvme/nvme_opal.o 00:03:40.591 CC lib/nvme/nvme_io_msg.o 00:03:40.851 CC lib/nvme/nvme_poll_group.o 00:03:40.851 CC lib/nvme/nvme_zns.o 00:03:40.851 CC lib/nvme/nvme_stubs.o 00:03:40.851 CC lib/nvme/nvme_auth.o 00:03:40.851 CC lib/nvme/nvme_cuse.o 00:03:40.851 CC lib/nvme/nvme_vfio_user.o 00:03:40.851 CC lib/nvme/nvme_rdma.o 00:03:40.851 LIB libspdk_thread.a 00:03:41.111 CC lib/virtio/virtio.o 00:03:41.111 CC lib/virtio/virtio_pci.o 00:03:41.111 CC lib/virtio/virtio_vhost_user.o 00:03:41.111 CC lib/virtio/virtio_vfio_user.o 00:03:41.111 CC lib/vfu_tgt/tgt_rpc.o 00:03:41.111 CC lib/vfu_tgt/tgt_endpoint.o 00:03:41.111 CC lib/accel/accel_sw.o 00:03:41.111 CC lib/accel/accel.o 00:03:41.111 CC lib/accel/accel_rpc.o 00:03:41.111 CC lib/init/json_config.o 00:03:41.111 CC lib/init/subsystem.o 00:03:41.111 CC lib/blob/blobstore.o 00:03:41.111 CC lib/init/subsystem_rpc.o 00:03:41.111 CC lib/blob/request.o 00:03:41.111 CC lib/init/rpc.o 00:03:41.111 CC lib/blob/zeroes.o 00:03:41.111 CC lib/blob/blob_bs_dev.o 00:03:41.111 CC lib/fsdev/fsdev.o 00:03:41.111 CC lib/fsdev/fsdev_io.o 00:03:41.111 CC lib/fsdev/fsdev_rpc.o 00:03:41.371 LIB libspdk_init.a 00:03:41.371 LIB libspdk_virtio.a 00:03:41.371 LIB libspdk_vfu_tgt.a 00:03:41.632 LIB libspdk_fsdev.a 00:03:41.632 CC lib/event/app.o 00:03:41.632 CC lib/event/reactor.o 00:03:41.632 CC lib/event/log_rpc.o 00:03:41.632 CC lib/event/app_rpc.o 00:03:41.632 CC lib/event/scheduler_static.o 00:03:41.892 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:41.892 LIB libspdk_accel.a 00:03:41.892 LIB libspdk_event.a 00:03:41.892 LIB libspdk_nvme.a 00:03:42.152 CC lib/bdev/bdev.o 00:03:42.152 CC lib/bdev/bdev_rpc.o 00:03:42.152 CC lib/bdev/bdev_zone.o 00:03:42.152 CC lib/bdev/part.o 00:03:42.152 CC lib/bdev/scsi_nvme.o 00:03:42.412 LIB libspdk_fuse_dispatcher.a 00:03:42.984 LIB libspdk_blob.a 00:03:43.244 CC lib/blobfs/blobfs.o 00:03:43.244 CC lib/blobfs/tree.o 00:03:43.244 CC lib/lvol/lvol.o 00:03:43.815 LIB libspdk_lvol.a 00:03:43.815 LIB libspdk_blobfs.a 00:03:43.815 LIB libspdk_bdev.a 00:03:44.075 CC lib/nvmf/ctrlr.o 00:03:44.075 CC lib/nvmf/ctrlr_discovery.o 00:03:44.075 CC lib/nvmf/ctrlr_bdev.o 00:03:44.075 CC lib/nvmf/nvmf.o 00:03:44.075 CC lib/nvmf/subsystem.o 00:03:44.335 CC lib/nvmf/nvmf_rpc.o 00:03:44.335 CC lib/nvmf/transport.o 00:03:44.335 CC lib/nvmf/tcp.o 00:03:44.335 CC lib/nvmf/stubs.o 00:03:44.335 CC lib/nvmf/mdns_server.o 00:03:44.335 CC lib/nvmf/vfio_user.o 00:03:44.335 CC lib/nvmf/rdma.o 00:03:44.335 CC lib/nvmf/auth.o 00:03:44.335 CC lib/scsi/dev.o 00:03:44.335 CC lib/scsi/lun.o 00:03:44.335 CC lib/scsi/port.o 00:03:44.335 CC lib/nbd/nbd.o 00:03:44.335 CC lib/scsi/scsi.o 00:03:44.335 CC lib/ftl/ftl_core.o 00:03:44.335 CC lib/scsi/scsi_bdev.o 00:03:44.335 CC 
lib/ftl/ftl_init.o 00:03:44.335 CC lib/scsi/scsi_pr.o 00:03:44.335 CC lib/scsi/scsi_rpc.o 00:03:44.335 CC lib/nbd/nbd_rpc.o 00:03:44.335 CC lib/ublk/ublk_rpc.o 00:03:44.335 CC lib/scsi/task.o 00:03:44.335 CC lib/ublk/ublk.o 00:03:44.335 CC lib/ftl/ftl_layout.o 00:03:44.335 CC lib/ftl/ftl_debug.o 00:03:44.335 CC lib/ftl/ftl_sb.o 00:03:44.335 CC lib/ftl/ftl_io.o 00:03:44.335 CC lib/ftl/ftl_l2p.o 00:03:44.335 CC lib/ftl/ftl_l2p_flat.o 00:03:44.335 CC lib/ftl/ftl_nv_cache.o 00:03:44.335 CC lib/ftl/ftl_band.o 00:03:44.335 CC lib/ftl/ftl_writer.o 00:03:44.335 CC lib/ftl/ftl_band_ops.o 00:03:44.335 CC lib/ftl/ftl_rq.o 00:03:44.335 CC lib/ftl/ftl_reloc.o 00:03:44.335 CC lib/ftl/ftl_l2p_cache.o 00:03:44.335 CC lib/ftl/ftl_p2l.o 00:03:44.335 CC lib/ftl/ftl_p2l_log.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:44.335 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:44.335 CC lib/ftl/utils/ftl_conf.o 00:03:44.335 CC lib/ftl/utils/ftl_mempool.o 00:03:44.335 CC lib/ftl/utils/ftl_md.o 00:03:44.335 CC lib/ftl/utils/ftl_bitmap.o 00:03:44.335 CC lib/ftl/utils/ftl_property.o 00:03:44.335 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:44.335 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:44.335 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:44.335 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:44.335 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:44.335 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:44.335 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:44.335 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:44.335 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:44.335 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:44.335 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:44.335 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:44.335 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:44.335 CC lib/ftl/base/ftl_base_dev.o 00:03:44.335 CC lib/ftl/base/ftl_base_bdev.o 00:03:44.335 CC lib/ftl/ftl_trace.o 00:03:44.595 LIB libspdk_nbd.a 00:03:44.595 LIB libspdk_scsi.a 00:03:44.595 LIB libspdk_ublk.a 00:03:44.855 CC lib/iscsi/conn.o 00:03:44.855 CC lib/iscsi/init_grp.o 00:03:44.855 CC lib/vhost/vhost.o 00:03:44.855 CC lib/iscsi/iscsi.o 00:03:44.855 CC lib/vhost/vhost_rpc.o 00:03:44.855 CC lib/iscsi/param.o 00:03:44.855 CC lib/vhost/vhost_scsi.o 00:03:44.855 CC lib/iscsi/portal_grp.o 00:03:44.855 CC lib/vhost/vhost_blk.o 00:03:44.855 CC lib/iscsi/iscsi_subsystem.o 00:03:44.855 CC lib/iscsi/tgt_node.o 00:03:44.855 CC lib/vhost/rte_vhost_user.o 00:03:44.855 CC lib/iscsi/iscsi_rpc.o 00:03:44.855 CC lib/iscsi/task.o 00:03:44.855 LIB libspdk_ftl.a 00:03:45.425 LIB libspdk_nvmf.a 00:03:45.425 LIB libspdk_vhost.a 00:03:45.686 LIB libspdk_iscsi.a 00:03:46.256 CC module/env_dpdk/env_dpdk_rpc.o 00:03:46.256 CC module/vfu_device/vfu_virtio.o 00:03:46.256 CC module/vfu_device/vfu_virtio_blk.o 00:03:46.256 CC module/vfu_device/vfu_virtio_scsi.o 00:03:46.256 CC module/vfu_device/vfu_virtio_fs.o 00:03:46.256 CC module/vfu_device/vfu_virtio_rpc.o 00:03:46.256 LIB libspdk_env_dpdk_rpc.a 00:03:46.256 CC module/sock/posix/posix.o 00:03:46.256 CC module/accel/dsa/accel_dsa.o 00:03:46.256 CC 
module/accel/dsa/accel_dsa_rpc.o 00:03:46.256 CC module/accel/ioat/accel_ioat.o 00:03:46.256 CC module/accel/ioat/accel_ioat_rpc.o 00:03:46.256 CC module/fsdev/aio/fsdev_aio.o 00:03:46.256 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:46.256 CC module/fsdev/aio/linux_aio_mgr.o 00:03:46.256 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:46.256 CC module/blob/bdev/blob_bdev.o 00:03:46.256 CC module/scheduler/gscheduler/gscheduler.o 00:03:46.256 CC module/keyring/file/keyring.o 00:03:46.256 CC module/keyring/linux/keyring.o 00:03:46.256 CC module/keyring/linux/keyring_rpc.o 00:03:46.256 CC module/keyring/file/keyring_rpc.o 00:03:46.256 CC module/accel/iaa/accel_iaa.o 00:03:46.256 CC module/accel/iaa/accel_iaa_rpc.o 00:03:46.256 CC module/accel/error/accel_error.o 00:03:46.256 CC module/accel/error/accel_error_rpc.o 00:03:46.256 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:46.256 LIB libspdk_keyring_linux.a 00:03:46.528 LIB libspdk_scheduler_gscheduler.a 00:03:46.528 LIB libspdk_keyring_file.a 00:03:46.528 LIB libspdk_accel_ioat.a 00:03:46.528 LIB libspdk_scheduler_dpdk_governor.a 00:03:46.528 LIB libspdk_scheduler_dynamic.a 00:03:46.528 LIB libspdk_accel_iaa.a 00:03:46.528 LIB libspdk_accel_error.a 00:03:46.528 LIB libspdk_blob_bdev.a 00:03:46.528 LIB libspdk_accel_dsa.a 00:03:46.528 LIB libspdk_vfu_device.a 00:03:46.789 LIB libspdk_sock_posix.a 00:03:46.789 LIB libspdk_fsdev_aio.a 00:03:46.789 CC module/bdev/delay/vbdev_delay.o 00:03:46.789 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:46.789 CC module/bdev/null/bdev_null_rpc.o 00:03:46.789 CC module/bdev/null/bdev_null.o 00:03:46.789 CC module/bdev/lvol/vbdev_lvol.o 00:03:46.789 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:46.789 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:46.789 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:46.789 CC module/bdev/gpt/gpt.o 00:03:46.789 CC module/bdev/gpt/vbdev_gpt.o 00:03:46.789 CC module/bdev/error/vbdev_error.o 00:03:46.789 CC module/bdev/error/vbdev_error_rpc.o 00:03:46.789 CC module/bdev/aio/bdev_aio.o 00:03:46.789 CC module/blobfs/bdev/blobfs_bdev.o 00:03:46.789 CC module/bdev/iscsi/bdev_iscsi.o 00:03:46.789 CC module/bdev/aio/bdev_aio_rpc.o 00:03:46.789 CC module/bdev/ftl/bdev_ftl.o 00:03:46.789 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:46.789 CC module/bdev/raid/bdev_raid.o 00:03:46.789 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:46.789 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:46.789 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:46.789 CC module/bdev/raid/bdev_raid_rpc.o 00:03:46.790 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:46.790 CC module/bdev/raid/bdev_raid_sb.o 00:03:46.790 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:46.790 CC module/bdev/raid/raid0.o 00:03:46.790 CC module/bdev/split/vbdev_split.o 00:03:46.790 CC module/bdev/passthru/vbdev_passthru.o 00:03:46.790 CC module/bdev/split/vbdev_split_rpc.o 00:03:46.790 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:46.790 CC module/bdev/raid/raid1.o 00:03:46.790 CC module/bdev/malloc/bdev_malloc.o 00:03:46.790 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:46.790 CC module/bdev/raid/concat.o 00:03:46.790 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:46.790 CC module/bdev/nvme/bdev_nvme.o 00:03:46.790 CC module/bdev/nvme/nvme_rpc.o 00:03:46.790 CC module/bdev/nvme/bdev_mdns_client.o 00:03:46.790 CC module/bdev/nvme/vbdev_opal.o 00:03:46.790 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:46.790 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:47.049 LIB libspdk_blobfs_bdev.a 00:03:47.049 LIB libspdk_bdev_split.a 
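Each LIB line in this stretch is a static archive being emitted for one SPDK component, consistent with the all-static configuration noted earlier. To sanity-check a finished component after make, a sketch assuming SPDK's default build/lib output directory (the module name is just one of those being linked here):

    # Sketch only: inspect one of the bdev module archives produced around here.
    ls build/lib/libspdk_bdev_*.a
    ar t build/lib/libspdk_bdev_malloc.a                          # member object files
    nm -C --defined-only build/lib/libspdk_bdev_malloc.a | head   # defined symbols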
00:03:47.049 LIB libspdk_bdev_gpt.a 00:03:47.049 LIB libspdk_bdev_null.a 00:03:47.049 LIB libspdk_bdev_error.a 00:03:47.049 LIB libspdk_bdev_ftl.a 00:03:47.049 LIB libspdk_bdev_aio.a 00:03:47.049 LIB libspdk_bdev_passthru.a 00:03:47.049 LIB libspdk_bdev_zone_block.a 00:03:47.049 LIB libspdk_bdev_iscsi.a 00:03:47.049 LIB libspdk_bdev_delay.a 00:03:47.049 LIB libspdk_bdev_malloc.a 00:03:47.310 LIB libspdk_bdev_lvol.a 00:03:47.310 LIB libspdk_bdev_virtio.a 00:03:47.572 LIB libspdk_bdev_raid.a 00:03:48.514 LIB libspdk_bdev_nvme.a 00:03:48.775 CC module/event/subsystems/vmd/vmd.o 00:03:48.775 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:48.775 CC module/event/subsystems/keyring/keyring.o 00:03:48.775 CC module/event/subsystems/scheduler/scheduler.o 00:03:48.775 CC module/event/subsystems/iobuf/iobuf.o 00:03:48.775 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:48.775 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:48.775 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:49.036 CC module/event/subsystems/sock/sock.o 00:03:49.036 CC module/event/subsystems/fsdev/fsdev.o 00:03:49.036 LIB libspdk_event_vmd.a 00:03:49.036 LIB libspdk_event_scheduler.a 00:03:49.036 LIB libspdk_event_vhost_blk.a 00:03:49.036 LIB libspdk_event_vfu_tgt.a 00:03:49.036 LIB libspdk_event_keyring.a 00:03:49.036 LIB libspdk_event_fsdev.a 00:03:49.036 LIB libspdk_event_sock.a 00:03:49.036 LIB libspdk_event_iobuf.a 00:03:49.297 CC module/event/subsystems/accel/accel.o 00:03:49.558 LIB libspdk_event_accel.a 00:03:49.818 CC module/event/subsystems/bdev/bdev.o 00:03:49.818 LIB libspdk_event_bdev.a 00:03:50.079 CC module/event/subsystems/nbd/nbd.o 00:03:50.079 CC module/event/subsystems/ublk/ublk.o 00:03:50.079 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:50.079 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:50.079 CC module/event/subsystems/scsi/scsi.o 00:03:50.340 LIB libspdk_event_ublk.a 00:03:50.340 LIB libspdk_event_nbd.a 00:03:50.340 LIB libspdk_event_scsi.a 00:03:50.340 LIB libspdk_event_nvmf.a 00:03:50.601 CC module/event/subsystems/iscsi/iscsi.o 00:03:50.601 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:50.862 LIB libspdk_event_vhost_scsi.a 00:03:50.862 LIB libspdk_event_iscsi.a 00:03:51.126 CXX app/trace/trace.o 00:03:51.126 CC app/trace_record/trace_record.o 00:03:51.126 CC app/spdk_nvme_identify/identify.o 00:03:51.126 CC app/spdk_nvme_discover/discovery_aer.o 00:03:51.126 CC app/spdk_lspci/spdk_lspci.o 00:03:51.126 TEST_HEADER include/spdk/accel.h 00:03:51.126 TEST_HEADER include/spdk/assert.h 00:03:51.126 TEST_HEADER include/spdk/accel_module.h 00:03:51.126 TEST_HEADER include/spdk/barrier.h 00:03:51.126 TEST_HEADER include/spdk/base64.h 00:03:51.126 TEST_HEADER include/spdk/bdev_zone.h 00:03:51.126 TEST_HEADER include/spdk/bdev.h 00:03:51.126 TEST_HEADER include/spdk/bit_array.h 00:03:51.126 TEST_HEADER include/spdk/bdev_module.h 00:03:51.126 CC app/spdk_nvme_perf/perf.o 00:03:51.126 CC app/spdk_top/spdk_top.o 00:03:51.126 TEST_HEADER include/spdk/bit_pool.h 00:03:51.126 CC test/rpc_client/rpc_client_test.o 00:03:51.126 TEST_HEADER include/spdk/blobfs.h 00:03:51.126 TEST_HEADER include/spdk/blob_bdev.h 00:03:51.126 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:51.126 TEST_HEADER include/spdk/conf.h 00:03:51.126 TEST_HEADER include/spdk/blob.h 00:03:51.126 TEST_HEADER include/spdk/cpuset.h 00:03:51.126 TEST_HEADER include/spdk/config.h 00:03:51.126 TEST_HEADER include/spdk/dma.h 00:03:51.126 TEST_HEADER include/spdk/dif.h 00:03:51.126 TEST_HEADER include/spdk/crc16.h 00:03:51.126 
TEST_HEADER include/spdk/crc32.h 00:03:51.126 TEST_HEADER include/spdk/crc64.h 00:03:51.126 TEST_HEADER include/spdk/event.h 00:03:51.126 TEST_HEADER include/spdk/endian.h 00:03:51.126 TEST_HEADER include/spdk/env_dpdk.h 00:03:51.126 TEST_HEADER include/spdk/env.h 00:03:51.126 TEST_HEADER include/spdk/file.h 00:03:51.126 TEST_HEADER include/spdk/fd_group.h 00:03:51.126 TEST_HEADER include/spdk/fd.h 00:03:51.126 TEST_HEADER include/spdk/fsdev_module.h 00:03:51.126 TEST_HEADER include/spdk/fsdev.h 00:03:51.126 TEST_HEADER include/spdk/gpt_spec.h 00:03:51.126 TEST_HEADER include/spdk/ftl.h 00:03:51.126 TEST_HEADER include/spdk/hexlify.h 00:03:51.126 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:51.126 TEST_HEADER include/spdk/idxd.h 00:03:51.126 TEST_HEADER include/spdk/idxd_spec.h 00:03:51.126 TEST_HEADER include/spdk/histogram_data.h 00:03:51.126 TEST_HEADER include/spdk/init.h 00:03:51.126 TEST_HEADER include/spdk/ioat.h 00:03:51.126 TEST_HEADER include/spdk/ioat_spec.h 00:03:51.126 TEST_HEADER include/spdk/keyring_module.h 00:03:51.126 TEST_HEADER include/spdk/likely.h 00:03:51.126 TEST_HEADER include/spdk/iscsi_spec.h 00:03:51.126 TEST_HEADER include/spdk/keyring.h 00:03:51.126 TEST_HEADER include/spdk/json.h 00:03:51.126 TEST_HEADER include/spdk/jsonrpc.h 00:03:51.126 TEST_HEADER include/spdk/lvol.h 00:03:51.126 TEST_HEADER include/spdk/log.h 00:03:51.126 TEST_HEADER include/spdk/md5.h 00:03:51.126 TEST_HEADER include/spdk/memory.h 00:03:51.126 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:51.126 CC app/nvmf_tgt/nvmf_main.o 00:03:51.126 TEST_HEADER include/spdk/nbd.h 00:03:51.126 TEST_HEADER include/spdk/mmio.h 00:03:51.126 CC app/spdk_dd/spdk_dd.o 00:03:51.126 TEST_HEADER include/spdk/net.h 00:03:51.126 TEST_HEADER include/spdk/notify.h 00:03:51.126 CC app/iscsi_tgt/iscsi_tgt.o 00:03:51.126 TEST_HEADER include/spdk/nvme.h 00:03:51.126 TEST_HEADER include/spdk/nvme_intel.h 00:03:51.126 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:51.126 TEST_HEADER include/spdk/nvme_zns.h 00:03:51.126 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:51.126 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:51.126 TEST_HEADER include/spdk/nvme_spec.h 00:03:51.126 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:51.126 TEST_HEADER include/spdk/nvmf.h 00:03:51.126 TEST_HEADER include/spdk/nvmf_spec.h 00:03:51.126 TEST_HEADER include/spdk/nvmf_transport.h 00:03:51.126 TEST_HEADER include/spdk/opal.h 00:03:51.126 TEST_HEADER include/spdk/opal_spec.h 00:03:51.126 TEST_HEADER include/spdk/pci_ids.h 00:03:51.126 TEST_HEADER include/spdk/pipe.h 00:03:51.126 TEST_HEADER include/spdk/queue.h 00:03:51.126 TEST_HEADER include/spdk/reduce.h 00:03:51.126 TEST_HEADER include/spdk/rpc.h 00:03:51.126 TEST_HEADER include/spdk/scsi.h 00:03:51.126 TEST_HEADER include/spdk/scheduler.h 00:03:51.126 TEST_HEADER include/spdk/sock.h 00:03:51.126 TEST_HEADER include/spdk/scsi_spec.h 00:03:51.126 TEST_HEADER include/spdk/string.h 00:03:51.126 TEST_HEADER include/spdk/stdinc.h 00:03:51.126 TEST_HEADER include/spdk/thread.h 00:03:51.126 TEST_HEADER include/spdk/trace_parser.h 00:03:51.126 TEST_HEADER include/spdk/trace.h 00:03:51.126 TEST_HEADER include/spdk/ublk.h 00:03:51.126 TEST_HEADER include/spdk/util.h 00:03:51.126 TEST_HEADER include/spdk/tree.h 00:03:51.126 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:51.126 TEST_HEADER include/spdk/uuid.h 00:03:51.126 TEST_HEADER include/spdk/version.h 00:03:51.126 TEST_HEADER include/spdk/vmd.h 00:03:51.126 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:51.126 TEST_HEADER include/spdk/vhost.h 
00:03:51.126 TEST_HEADER include/spdk/xor.h 00:03:51.126 TEST_HEADER include/spdk/zipf.h 00:03:51.126 CXX test/cpp_headers/accel.o 00:03:51.126 CXX test/cpp_headers/accel_module.o 00:03:51.126 CXX test/cpp_headers/assert.o 00:03:51.126 CXX test/cpp_headers/barrier.o 00:03:51.126 CXX test/cpp_headers/base64.o 00:03:51.126 CC app/spdk_tgt/spdk_tgt.o 00:03:51.126 CXX test/cpp_headers/bdev_module.o 00:03:51.126 CXX test/cpp_headers/bdev.o 00:03:51.126 CXX test/cpp_headers/bit_array.o 00:03:51.126 CXX test/cpp_headers/bit_pool.o 00:03:51.126 CXX test/cpp_headers/bdev_zone.o 00:03:51.126 CXX test/cpp_headers/blobfs_bdev.o 00:03:51.126 CXX test/cpp_headers/blob_bdev.o 00:03:51.126 CXX test/cpp_headers/conf.o 00:03:51.126 CXX test/cpp_headers/blobfs.o 00:03:51.126 CXX test/cpp_headers/blob.o 00:03:51.126 CXX test/cpp_headers/config.o 00:03:51.126 CXX test/cpp_headers/cpuset.o 00:03:51.126 CXX test/cpp_headers/crc32.o 00:03:51.126 CXX test/cpp_headers/crc16.o 00:03:51.126 CXX test/cpp_headers/crc64.o 00:03:51.126 CXX test/cpp_headers/dif.o 00:03:51.126 CXX test/cpp_headers/endian.o 00:03:51.126 CXX test/cpp_headers/dma.o 00:03:51.126 CXX test/cpp_headers/env_dpdk.o 00:03:51.126 CXX test/cpp_headers/env.o 00:03:51.126 CXX test/cpp_headers/fd_group.o 00:03:51.126 CXX test/cpp_headers/event.o 00:03:51.126 CXX test/cpp_headers/file.o 00:03:51.126 CXX test/cpp_headers/fd.o 00:03:51.126 CXX test/cpp_headers/fsdev.o 00:03:51.126 CXX test/cpp_headers/fsdev_module.o 00:03:51.126 CXX test/cpp_headers/fuse_dispatcher.o 00:03:51.126 CXX test/cpp_headers/ftl.o 00:03:51.126 CXX test/cpp_headers/hexlify.o 00:03:51.126 CXX test/cpp_headers/gpt_spec.o 00:03:51.126 CXX test/cpp_headers/idxd.o 00:03:51.126 CC examples/ioat/perf/perf.o 00:03:51.126 CXX test/cpp_headers/histogram_data.o 00:03:51.126 CC examples/ioat/verify/verify.o 00:03:51.126 CXX test/cpp_headers/idxd_spec.o 00:03:51.126 CXX test/cpp_headers/init.o 00:03:51.126 CXX test/cpp_headers/ioat_spec.o 00:03:51.126 CXX test/cpp_headers/ioat.o 00:03:51.126 CXX test/cpp_headers/json.o 00:03:51.126 CXX test/cpp_headers/iscsi_spec.o 00:03:51.126 CC examples/util/zipf/zipf.o 00:03:51.126 CXX test/cpp_headers/jsonrpc.o 00:03:51.126 CXX test/cpp_headers/keyring.o 00:03:51.126 CXX test/cpp_headers/keyring_module.o 00:03:51.126 CXX test/cpp_headers/likely.o 00:03:51.126 CXX test/cpp_headers/log.o 00:03:51.126 CXX test/cpp_headers/lvol.o 00:03:51.126 CXX test/cpp_headers/md5.o 00:03:51.126 CXX test/cpp_headers/memory.o 00:03:51.126 CXX test/cpp_headers/mmio.o 00:03:51.126 CXX test/cpp_headers/nbd.o 00:03:51.126 CXX test/cpp_headers/net.o 00:03:51.126 CXX test/cpp_headers/nvme.o 00:03:51.126 CXX test/cpp_headers/notify.o 00:03:51.126 CXX test/cpp_headers/nvme_intel.o 00:03:51.126 CXX test/cpp_headers/nvme_ocssd.o 00:03:51.126 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:51.127 CXX test/cpp_headers/nvme_spec.o 00:03:51.127 CXX test/cpp_headers/nvme_zns.o 00:03:51.127 CXX test/cpp_headers/nvmf_cmd.o 00:03:51.127 CC test/env/vtophys/vtophys.o 00:03:51.127 CC test/env/memory/memory_ut.o 00:03:51.127 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:51.127 CXX test/cpp_headers/nvmf_spec.o 00:03:51.127 CXX test/cpp_headers/nvmf.o 00:03:51.127 CXX test/cpp_headers/nvmf_transport.o 00:03:51.127 CXX test/cpp_headers/opal.o 00:03:51.127 CXX test/cpp_headers/opal_spec.o 00:03:51.127 CXX test/cpp_headers/pci_ids.o 00:03:51.127 CXX test/cpp_headers/pipe.o 00:03:51.127 CXX test/cpp_headers/reduce.o 00:03:51.127 CXX test/cpp_headers/queue.o 00:03:51.127 CXX test/cpp_headers/rpc.o 00:03:51.127 
CXX test/cpp_headers/scheduler.o 00:03:51.127 CC test/app/jsoncat/jsoncat.o 00:03:51.127 CXX test/cpp_headers/scsi_spec.o 00:03:51.127 CXX test/cpp_headers/scsi.o 00:03:51.127 CC test/app/histogram_perf/histogram_perf.o 00:03:51.127 CXX test/cpp_headers/stdinc.o 00:03:51.127 CXX test/cpp_headers/sock.o 00:03:51.127 CC test/thread/poller_perf/poller_perf.o 00:03:51.127 CC app/fio/nvme/fio_plugin.o 00:03:51.127 CC test/env/pci/pci_ut.o 00:03:51.127 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:51.127 CXX test/cpp_headers/string.o 00:03:51.127 CC test/thread/lock/spdk_lock.o 00:03:51.127 CC test/app/stub/stub.o 00:03:51.127 LINK spdk_lspci 00:03:51.386 LINK spdk_nvme_discover 00:03:51.386 CC app/fio/bdev/fio_plugin.o 00:03:51.386 CXX test/cpp_headers/thread.o 00:03:51.386 CC test/dma/test_dma/test_dma.o 00:03:51.386 LINK rpc_client_test 00:03:51.386 CC test/app/bdev_svc/bdev_svc.o 00:03:51.386 LINK spdk_trace_record 00:03:51.386 CC test/env/mem_callbacks/mem_callbacks.o 00:03:51.386 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:51.386 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:51.386 LINK nvmf_tgt 00:03:51.386 LINK interrupt_tgt 00:03:51.386 CXX test/cpp_headers/trace.o 00:03:51.386 CXX test/cpp_headers/trace_parser.o 00:03:51.386 CXX test/cpp_headers/tree.o 00:03:51.386 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:51.386 CXX test/cpp_headers/ublk.o 00:03:51.386 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:51.386 CXX test/cpp_headers/util.o 00:03:51.386 CXX test/cpp_headers/uuid.o 00:03:51.386 CXX test/cpp_headers/version.o 00:03:51.386 CXX test/cpp_headers/vfio_user_pci.o 00:03:51.386 CXX test/cpp_headers/vfio_user_spec.o 00:03:51.386 LINK zipf 00:03:51.386 CXX test/cpp_headers/vhost.o 00:03:51.386 CXX test/cpp_headers/vmd.o 00:03:51.386 LINK vtophys 00:03:51.386 CXX test/cpp_headers/xor.o 00:03:51.386 CXX test/cpp_headers/zipf.o 00:03:51.386 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:51.386 LINK jsoncat 00:03:51.386 LINK iscsi_tgt 00:03:51.386 LINK poller_perf 00:03:51.386 LINK histogram_perf 00:03:51.386 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:51.386 LINK env_dpdk_post_init 00:03:51.386 LINK ioat_perf 00:03:51.386 LINK verify 00:03:51.386 LINK stub 00:03:51.386 LINK spdk_tgt 00:03:51.386 LINK spdk_trace 00:03:51.646 LINK bdev_svc 00:03:51.646 LINK mem_callbacks 00:03:51.646 LINK spdk_dd 00:03:51.646 LINK nvme_fuzz 00:03:51.646 LINK llvm_vfio_fuzz 00:03:51.646 LINK pci_ut 00:03:51.646 LINK spdk_nvme_identify 00:03:51.646 LINK vhost_fuzz 00:03:51.646 LINK test_dma 00:03:51.646 LINK spdk_nvme_perf 00:03:51.907 LINK spdk_top 00:03:51.907 LINK spdk_bdev 00:03:51.907 LINK spdk_nvme 00:03:51.907 LINK llvm_nvme_fuzz 00:03:51.907 LINK memory_ut 00:03:51.907 CC examples/sock/hello_world/hello_sock.o 00:03:51.907 CC app/vhost/vhost.o 00:03:51.907 CC examples/idxd/perf/perf.o 00:03:52.169 CC examples/vmd/led/led.o 00:03:52.169 CC examples/vmd/lsvmd/lsvmd.o 00:03:52.169 CC examples/thread/thread/thread_ex.o 00:03:52.169 LINK hello_sock 00:03:52.169 LINK lsvmd 00:03:52.169 LINK vhost 00:03:52.169 LINK led 00:03:52.169 LINK idxd_perf 00:03:52.169 LINK thread 00:03:52.429 LINK spdk_lock 00:03:52.429 LINK iscsi_fuzz 00:03:53.000 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:53.000 CC examples/nvme/arbitration/arbitration.o 00:03:53.000 CC examples/nvme/reconnect/reconnect.o 00:03:53.000 CC examples/nvme/hotplug/hotplug.o 00:03:53.000 CC examples/nvme/hello_world/hello_world.o 00:03:53.000 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:53.000 CC 
examples/nvme/abort/abort.o 00:03:53.000 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:53.000 CC test/event/event_perf/event_perf.o 00:03:53.000 CC test/event/reactor_perf/reactor_perf.o 00:03:53.000 CC test/event/reactor/reactor.o 00:03:53.000 LINK pmr_persistence 00:03:53.000 CC test/event/app_repeat/app_repeat.o 00:03:53.000 LINK cmb_copy 00:03:53.000 LINK hotplug 00:03:53.000 LINK hello_world 00:03:53.000 CC test/event/scheduler/scheduler.o 00:03:53.000 LINK reconnect 00:03:53.260 LINK arbitration 00:03:53.260 LINK abort 00:03:53.260 LINK reactor_perf 00:03:53.260 LINK event_perf 00:03:53.260 LINK reactor 00:03:53.260 LINK nvme_manage 00:03:53.260 LINK app_repeat 00:03:53.260 LINK scheduler 00:03:53.520 CC test/nvme/reset/reset.o 00:03:53.520 CC test/nvme/aer/aer.o 00:03:53.520 CC test/nvme/simple_copy/simple_copy.o 00:03:53.520 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:53.520 CC test/nvme/reserve/reserve.o 00:03:53.520 CC test/nvme/fdp/fdp.o 00:03:53.520 CC test/nvme/startup/startup.o 00:03:53.520 CC test/nvme/sgl/sgl.o 00:03:53.520 CC test/nvme/overhead/overhead.o 00:03:53.520 CC test/nvme/err_injection/err_injection.o 00:03:53.520 CC test/nvme/boot_partition/boot_partition.o 00:03:53.520 CC test/nvme/cuse/cuse.o 00:03:53.520 CC test/nvme/fused_ordering/fused_ordering.o 00:03:53.520 CC test/nvme/compliance/nvme_compliance.o 00:03:53.520 CC test/nvme/connect_stress/connect_stress.o 00:03:53.520 CC test/accel/dif/dif.o 00:03:53.520 CC test/nvme/e2edp/nvme_dp.o 00:03:53.520 CC test/blobfs/mkfs/mkfs.o 00:03:53.520 CC test/lvol/esnap/esnap.o 00:03:53.520 LINK boot_partition 00:03:53.520 LINK startup 00:03:53.520 LINK err_injection 00:03:53.520 LINK connect_stress 00:03:53.520 LINK doorbell_aers 00:03:53.520 LINK reserve 00:03:53.520 LINK simple_copy 00:03:53.520 LINK fused_ordering 00:03:53.520 LINK reset 00:03:53.520 LINK aer 00:03:53.520 LINK nvme_dp 00:03:53.520 LINK sgl 00:03:53.520 LINK mkfs 00:03:53.520 LINK fdp 00:03:53.520 LINK overhead 00:03:53.780 LINK nvme_compliance 00:03:53.780 LINK dif 00:03:54.040 CC examples/accel/perf/accel_perf.o 00:03:54.040 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:54.040 CC examples/blob/hello_world/hello_blob.o 00:03:54.040 CC examples/blob/cli/blobcli.o 00:03:54.300 LINK hello_blob 00:03:54.300 LINK hello_fsdev 00:03:54.300 LINK accel_perf 00:03:54.300 LINK cuse 00:03:54.300 LINK blobcli 00:03:55.241 CC examples/bdev/hello_world/hello_bdev.o 00:03:55.241 CC examples/bdev/bdevperf/bdevperf.o 00:03:55.241 LINK hello_bdev 00:03:55.501 CC test/bdev/bdevio/bdevio.o 00:03:55.501 LINK bdevperf 00:03:55.762 LINK bdevio 00:03:56.703 LINK esnap 00:03:56.964 CC examples/nvmf/nvmf/nvmf.o 00:03:57.224 LINK nvmf 00:03:58.609 00:03:58.609 real 0m36.882s 00:03:58.609 user 4m38.249s 00:03:58.609 sys 1m46.855s 00:03:58.609 10:58:22 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:03:58.609 10:58:22 make -- common/autotest_common.sh@10 -- $ set +x 00:03:58.609 ************************************ 00:03:58.609 END TEST make 00:03:58.609 ************************************ 00:03:58.609 10:58:23 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:58.609 10:58:23 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:58.609 10:58:23 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:58.609 10:58:23 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.609 10:58:23 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:58.609 
10:58:23 -- pm/common@44 -- $ pid=5687 00:03:58.609 10:58:23 -- pm/common@50 -- $ kill -TERM 5687 00:03:58.609 10:58:23 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.609 10:58:23 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:58.609 10:58:23 -- pm/common@44 -- $ pid=5689 00:03:58.609 10:58:23 -- pm/common@50 -- $ kill -TERM 5689 00:03:58.609 10:58:23 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.609 10:58:23 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:58.609 10:58:23 -- pm/common@44 -- $ pid=5691 00:03:58.609 10:58:23 -- pm/common@50 -- $ kill -TERM 5691 00:03:58.609 10:58:23 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.609 10:58:23 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:58.609 10:58:23 -- pm/common@44 -- $ pid=5715 00:03:58.609 10:58:23 -- pm/common@50 -- $ sudo -E kill -TERM 5715 00:03:58.609 10:58:23 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:03:58.609 10:58:23 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:03:58.609 10:58:23 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:03:58.609 10:58:23 -- common/autotest_common.sh@1693 -- # lcov --version 00:03:58.609 10:58:23 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:03:58.609 10:58:23 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:03:58.609 10:58:23 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:58.609 10:58:23 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:58.609 10:58:23 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:58.609 10:58:23 -- scripts/common.sh@336 -- # IFS=.-: 00:03:58.609 10:58:23 -- scripts/common.sh@336 -- # read -ra ver1 00:03:58.609 10:58:23 -- scripts/common.sh@337 -- # IFS=.-: 00:03:58.609 10:58:23 -- scripts/common.sh@337 -- # read -ra ver2 00:03:58.609 10:58:23 -- scripts/common.sh@338 -- # local 'op=<' 00:03:58.609 10:58:23 -- scripts/common.sh@340 -- # ver1_l=2 00:03:58.609 10:58:23 -- scripts/common.sh@341 -- # ver2_l=1 00:03:58.609 10:58:23 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:58.609 10:58:23 -- scripts/common.sh@344 -- # case "$op" in 00:03:58.609 10:58:23 -- scripts/common.sh@345 -- # : 1 00:03:58.609 10:58:23 -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:58.609 10:58:23 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:58.609 10:58:23 -- scripts/common.sh@365 -- # decimal 1 00:03:58.609 10:58:23 -- scripts/common.sh@353 -- # local d=1 00:03:58.609 10:58:23 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:58.609 10:58:23 -- scripts/common.sh@355 -- # echo 1 00:03:58.609 10:58:23 -- scripts/common.sh@365 -- # ver1[v]=1 00:03:58.609 10:58:23 -- scripts/common.sh@366 -- # decimal 2 00:03:58.609 10:58:23 -- scripts/common.sh@353 -- # local d=2 00:03:58.609 10:58:23 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:58.609 10:58:23 -- scripts/common.sh@355 -- # echo 2 00:03:58.609 10:58:23 -- scripts/common.sh@366 -- # ver2[v]=2 00:03:58.609 10:58:23 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:58.609 10:58:23 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:58.609 10:58:23 -- scripts/common.sh@368 -- # return 0 00:03:58.609 10:58:23 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:58.609 10:58:23 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:03:58.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.609 --rc genhtml_branch_coverage=1 00:03:58.609 --rc genhtml_function_coverage=1 00:03:58.609 --rc genhtml_legend=1 00:03:58.609 --rc geninfo_all_blocks=1 00:03:58.609 --rc geninfo_unexecuted_blocks=1 00:03:58.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.609 ' 00:03:58.609 10:58:23 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:03:58.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.609 --rc genhtml_branch_coverage=1 00:03:58.609 --rc genhtml_function_coverage=1 00:03:58.609 --rc genhtml_legend=1 00:03:58.609 --rc geninfo_all_blocks=1 00:03:58.609 --rc geninfo_unexecuted_blocks=1 00:03:58.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.609 ' 00:03:58.609 10:58:23 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:03:58.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.609 --rc genhtml_branch_coverage=1 00:03:58.609 --rc genhtml_function_coverage=1 00:03:58.609 --rc genhtml_legend=1 00:03:58.609 --rc geninfo_all_blocks=1 00:03:58.609 --rc geninfo_unexecuted_blocks=1 00:03:58.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.609 ' 00:03:58.609 10:58:23 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:03:58.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.609 --rc genhtml_branch_coverage=1 00:03:58.609 --rc genhtml_function_coverage=1 00:03:58.609 --rc genhtml_legend=1 00:03:58.609 --rc geninfo_all_blocks=1 00:03:58.609 --rc geninfo_unexecuted_blocks=1 00:03:58.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.609 ' 00:03:58.609 10:58:23 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:58.870 10:58:23 -- nvmf/common.sh@7 -- # uname -s 00:03:58.870 10:58:23 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:58.870 10:58:23 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:58.870 10:58:23 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:58.870 10:58:23 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:58.870 10:58:23 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:58.870 10:58:23 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:58.870 10:58:23 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:58.870 10:58:23 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:58.870 10:58:23 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:58.870 10:58:23 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:58.870 10:58:23 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:58.870 10:58:23 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:58.870 10:58:23 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:58.870 10:58:23 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:58.870 10:58:23 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:58.870 10:58:23 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:58.870 10:58:23 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:58.870 10:58:23 -- scripts/common.sh@15 -- # shopt -s extglob 00:03:58.870 10:58:23 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:58.870 10:58:23 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:58.870 10:58:23 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:58.870 10:58:23 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:58.870 10:58:23 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:58.870 10:58:23 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:58.870 10:58:23 -- paths/export.sh@5 -- # export PATH 00:03:58.871 10:58:23 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:58.871 10:58:23 -- nvmf/common.sh@51 -- # : 0 00:03:58.871 10:58:23 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:03:58.871 10:58:23 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:03:58.871 10:58:23 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:58.871 10:58:23 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:58.871 10:58:23 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:58.871 10:58:23 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:03:58.871 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:03:58.871 10:58:23 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:03:58.871 10:58:23 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:03:58.871 10:58:23 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:03:58.871 10:58:23 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:58.871 10:58:23 -- spdk/autotest.sh@32 -- # uname -s 00:03:58.871 
10:58:23 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:58.871 10:58:23 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:58.871 10:58:23 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:58.871 10:58:23 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:58.871 10:58:23 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:58.871 10:58:23 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:58.871 10:58:23 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:58.871 10:58:23 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:58.871 10:58:23 -- spdk/autotest.sh@48 -- # udevadm_pid=84666 00:03:58.871 10:58:23 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:58.871 10:58:23 -- pm/common@17 -- # local monitor 00:03:58.871 10:58:23 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:58.871 10:58:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.871 10:58:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.871 10:58:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.871 10:58:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:58.871 10:58:23 -- pm/common@21 -- # date +%s 00:03:58.871 10:58:23 -- pm/common@21 -- # date +%s 00:03:58.871 10:58:23 -- pm/common@21 -- # date +%s 00:03:58.871 10:58:23 -- pm/common@25 -- # sleep 1 00:03:58.871 10:58:23 -- pm/common@21 -- # date +%s 00:03:58.871 10:58:23 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731837503 00:03:58.871 10:58:23 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731837503 00:03:58.871 10:58:23 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731837503 00:03:58.871 10:58:23 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731837503 00:03:58.871 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731837503_collect-cpu-temp.pm.log 00:03:58.871 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731837503_collect-vmstat.pm.log 00:03:58.871 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731837503_collect-cpu-load.pm.log 00:03:58.871 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731837503_collect-bmc-pm.bmc.pm.log 00:03:59.814 10:58:24 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:59.814 10:58:24 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:59.814 10:58:24 -- common/autotest_common.sh@726 -- # xtrace_disable 00:03:59.814 10:58:24 -- common/autotest_common.sh@10 -- # set +x 
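Alongside the trap and timing markers above, autotest launches its resource monitors: each collector gets the output/power directory plus a shared epoch-stamped prefix from date +%s, and its output is redirected to a matching .pm.log. A sketch of that launch pattern using the same -d/-l/-p flags seen in the trace ($OUT is a stand-in for the jenkins output directory, and the explicit backgrounding is an assumption about what pm/common does internally):

    # Sketch only: start the power/perf collectors with a shared timestamped prefix.
    now=$(date +%s)
    for mon in collect-cpu-load collect-vmstat collect-cpu-temp; do
        scripts/perf/pm/$mon -d "$OUT/power" -l -p "monitor.autotest.sh.$now" &
    done
    # collect-bmc-pm is launched the same way, but under sudo -E in the log.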
00:03:59.814 10:58:24 -- spdk/autotest.sh@59 -- # create_test_list 00:03:59.814 10:58:24 -- common/autotest_common.sh@752 -- # xtrace_disable 00:03:59.814 10:58:24 -- common/autotest_common.sh@10 -- # set +x 00:03:59.814 10:58:24 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:59.814 10:58:24 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:59.814 10:58:24 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:59.814 10:58:24 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:59.814 10:58:24 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:59.814 10:58:24 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:59.814 10:58:24 -- common/autotest_common.sh@1457 -- # uname 00:04:00.074 10:58:24 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:00.074 10:58:24 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:00.074 10:58:24 -- common/autotest_common.sh@1477 -- # uname 00:04:00.074 10:58:24 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:00.074 10:58:24 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:00.074 10:58:24 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:00.074 lcov: LCOV version 1.15 00:04:00.074 10:58:24 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:08.215 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:10.764 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:16.050 10:58:40 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:16.050 10:58:40 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:16.050 10:58:40 -- common/autotest_common.sh@10 -- # set +x 00:04:16.050 10:58:40 -- spdk/autotest.sh@78 -- # rm -f 00:04:16.050 10:58:40 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:19.348 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:19.349 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:19.349 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:19.609 10:58:44 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:19.609 10:58:44 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:19.609 10:58:44 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:19.609 10:58:44 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:19.609 10:58:44 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:19.609 10:58:44 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:19.609 10:58:44 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:19.609 10:58:44 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:19.609 10:58:44 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:19.609 10:58:44 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:19.609 10:58:44 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:19.609 10:58:44 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:19.609 10:58:44 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:19.609 10:58:44 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:19.609 10:58:44 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:19.609 No valid GPT data, bailing 00:04:19.609 10:58:44 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:19.609 10:58:44 -- scripts/common.sh@394 -- # pt= 00:04:19.609 10:58:44 -- scripts/common.sh@395 -- # return 1 00:04:19.609 10:58:44 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:19.609 1+0 records in 00:04:19.609 1+0 records out 00:04:19.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00159051 s, 659 MB/s 00:04:19.609 10:58:44 -- spdk/autotest.sh@105 -- # sync 00:04:19.609 10:58:44 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:19.609 10:58:44 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:19.609 10:58:44 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:27.749 10:58:51 -- spdk/autotest.sh@111 -- # uname -s 00:04:27.749 10:58:51 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:27.749 10:58:51 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:27.749 10:58:51 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:27.749 10:58:51 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:27.749 10:58:51 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:27.749 10:58:51 -- common/autotest_common.sh@10 -- # set +x 00:04:27.749 ************************************ 00:04:27.749 START TEST setup.sh 00:04:27.749 ************************************ 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:27.749 * Looking for test storage... 
00:04:27.749 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:27.749 10:58:51 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:27.749 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.749 --rc genhtml_branch_coverage=1 00:04:27.749 --rc genhtml_function_coverage=1 00:04:27.749 --rc genhtml_legend=1 00:04:27.749 --rc geninfo_all_blocks=1 00:04:27.749 --rc geninfo_unexecuted_blocks=1 00:04:27.749 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.749 ' 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:27.749 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.749 --rc genhtml_branch_coverage=1 00:04:27.749 --rc genhtml_function_coverage=1 00:04:27.749 --rc genhtml_legend=1 00:04:27.749 --rc geninfo_all_blocks=1 00:04:27.749 --rc geninfo_unexecuted_blocks=1 
00:04:27.749 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.749 ' 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:27.749 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.749 --rc genhtml_branch_coverage=1 00:04:27.749 --rc genhtml_function_coverage=1 00:04:27.749 --rc genhtml_legend=1 00:04:27.749 --rc geninfo_all_blocks=1 00:04:27.749 --rc geninfo_unexecuted_blocks=1 00:04:27.749 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.749 ' 00:04:27.749 10:58:51 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:27.749 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.749 --rc genhtml_branch_coverage=1 00:04:27.749 --rc genhtml_function_coverage=1 00:04:27.749 --rc genhtml_legend=1 00:04:27.749 --rc geninfo_all_blocks=1 00:04:27.749 --rc geninfo_unexecuted_blocks=1 00:04:27.749 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.749 ' 00:04:27.749 10:58:51 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:27.749 10:58:51 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:27.750 10:58:51 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:27.750 10:58:51 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:27.750 10:58:51 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:27.750 10:58:51 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:27.750 ************************************ 00:04:27.750 START TEST acl 00:04:27.750 ************************************ 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:27.750 * Looking for test storage... 
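Each test script repeats the lcov version gate traced above: 'lt 1.15 2' splits both version strings on '.', '-' and ':' (IFS=.-:) and compares them numerically field by field. A hedged re-creation of just the less-than path (the function name is illustrative, missing fields compare as zero here, and the real cmp_versions also dispatches on other operators via the case statement visible in the trace):

    #!/usr/bin/env bash
    # Succeed when version $1 sorts strictly before version $2.
    version_lt() {
        local IFS=.-:
        local -a v1 v2
        read -ra v1 <<< "$1"
        read -ra v2 <<< "$2"
        local i max=$(( ${#v1[@]} > ${#v2[@]} ? ${#v1[@]} : ${#v2[@]} ))
        for (( i = 0; i < max; i++ )); do
            local a=${v1[i]:-0} b=${v2[i]:-0}
            [[ $a =~ ^[0-9]+$ ]] || a=0     # non-numeric fields compare as 0 here
            [[ $b =~ ^[0-9]+$ ]] || b=0
            (( 10#$a < 10#$b )) && return 0 # base-10 guards against leading zeros
            (( 10#$a > 10#$b )) && return 1
        done
        return 1   # equal versions are not "less than"
    }
    version_lt 1.15 2 && echo "1.15 < 2, keep the pre-2.x lcov flags"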
00:04:27.750 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:27.750 10:58:52 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:27.750 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.750 --rc genhtml_branch_coverage=1 00:04:27.750 --rc genhtml_function_coverage=1 00:04:27.750 --rc genhtml_legend=1 00:04:27.750 --rc geninfo_all_blocks=1 00:04:27.750 --rc geninfo_unexecuted_blocks=1 00:04:27.750 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.750 ' 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:27.750 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.750 --rc genhtml_branch_coverage=1 00:04:27.750 --rc 
genhtml_function_coverage=1 00:04:27.750 --rc genhtml_legend=1 00:04:27.750 --rc geninfo_all_blocks=1 00:04:27.750 --rc geninfo_unexecuted_blocks=1 00:04:27.750 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.750 ' 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:27.750 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.750 --rc genhtml_branch_coverage=1 00:04:27.750 --rc genhtml_function_coverage=1 00:04:27.750 --rc genhtml_legend=1 00:04:27.750 --rc geninfo_all_blocks=1 00:04:27.750 --rc geninfo_unexecuted_blocks=1 00:04:27.750 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.750 ' 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:27.750 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.750 --rc genhtml_branch_coverage=1 00:04:27.750 --rc genhtml_function_coverage=1 00:04:27.750 --rc genhtml_legend=1 00:04:27.750 --rc geninfo_all_blocks=1 00:04:27.750 --rc geninfo_unexecuted_blocks=1 00:04:27.750 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.750 ' 00:04:27.750 10:58:52 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:27.750 10:58:52 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:27.750 10:58:52 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:27.750 10:58:52 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:27.750 10:58:52 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:27.750 10:58:52 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:27.750 10:58:52 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:27.750 10:58:52 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:27.750 10:58:52 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:31.961 10:58:56 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:31.961 10:58:56 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:31.961 10:58:56 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:31.961 10:58:56 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:31.961 10:58:56 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.961 10:58:56 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:35.263 Hugepages 00:04:35.263 node hugesize free / total 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 00:04:35.263 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.263 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:35.264 10:58:59 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:35.264 10:58:59 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:35.264 10:58:59 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:35.264 10:58:59 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:35.264 ************************************ 00:04:35.264 START TEST denied 00:04:35.264 ************************************ 00:04:35.264 10:58:59 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:35.264 10:58:59 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:35.264 10:58:59 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:35.264 10:58:59 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:35.264 10:58:59 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:35.264 10:58:59 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:39.469 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:39.469 10:59:03 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:43.674 00:04:43.674 real 0m8.480s 00:04:43.674 user 0m2.697s 00:04:43.674 sys 0m5.111s 00:04:43.674 10:59:08 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:43.674 10:59:08 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:43.674 ************************************ 00:04:43.674 END TEST denied 00:04:43.674 ************************************ 00:04:43.674 10:59:08 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:43.674 10:59:08 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:43.674 10:59:08 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:43.674 10:59:08 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:43.935 ************************************ 00:04:43.935 START TEST allowed 00:04:43.935 ************************************ 00:04:43.935 10:59:08 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:04:43.935 10:59:08 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:43.935 10:59:08 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:43.935 10:59:08 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:43.935 10:59:08 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.935 10:59:08 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:49.220 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:49.220 10:59:13 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:49.220 10:59:13 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:49.220 10:59:13 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:49.220 10:59:13 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:49.220 10:59:13 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:53.423 00:04:53.423 real 0m9.131s 00:04:53.423 user 0m2.608s 00:04:53.423 sys 0m5.161s 00:04:53.423 10:59:17 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:53.423 10:59:17 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:53.423 ************************************ 00:04:53.423 END TEST allowed 00:04:53.423 ************************************ 00:04:53.423 00:04:53.423 real 0m25.494s 00:04:53.423 user 0m8.187s 00:04:53.423 sys 0m15.577s 00:04:53.423 10:59:17 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:53.423 10:59:17 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:53.423 ************************************ 00:04:53.423 END TEST acl 00:04:53.423 ************************************ 00:04:53.423 10:59:17 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:53.423 10:59:17 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:53.423 10:59:17 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:53.423 10:59:17 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:53.423 ************************************ 00:04:53.423 START TEST hugepages 00:04:53.423 ************************************ 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:53.423 * Looking for test storage... 00:04:53.423 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:53.423 10:59:17 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:53.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.423 --rc genhtml_branch_coverage=1 00:04:53.423 --rc genhtml_function_coverage=1 00:04:53.423 --rc genhtml_legend=1 00:04:53.423 --rc geninfo_all_blocks=1 00:04:53.423 --rc geninfo_unexecuted_blocks=1 00:04:53.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.423 ' 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:53.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.423 --rc genhtml_branch_coverage=1 00:04:53.423 --rc genhtml_function_coverage=1 00:04:53.423 --rc genhtml_legend=1 00:04:53.423 --rc geninfo_all_blocks=1 00:04:53.423 --rc geninfo_unexecuted_blocks=1 00:04:53.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.423 ' 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:53.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.423 --rc genhtml_branch_coverage=1 00:04:53.423 --rc genhtml_function_coverage=1 00:04:53.423 --rc genhtml_legend=1 00:04:53.423 --rc geninfo_all_blocks=1 00:04:53.423 --rc geninfo_unexecuted_blocks=1 00:04:53.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.423 ' 00:04:53.423 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:53.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.423 --rc genhtml_branch_coverage=1 00:04:53.424 --rc genhtml_function_coverage=1 00:04:53.424 --rc genhtml_legend=1 00:04:53.424 --rc geninfo_all_blocks=1 00:04:53.424 --rc geninfo_unexecuted_blocks=1 00:04:53.424 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.424 ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:53.424 10:59:17 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 39464764 kB' 'MemAvailable: 43151684 kB' 'Buffers: 8940 kB' 'Cached: 12494148 kB' 'SwapCached: 0 kB' 'Active: 9514848 kB' 'Inactive: 3662944 kB' 'Active(anon): 9109436 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 678216 kB' 'Mapped: 143068 kB' 'Shmem: 8434732 kB' 'KReclaimable: 230864 kB' 'Slab: 841868 kB' 'SReclaimable: 230864 kB' 'SUnreclaim: 611004 kB' 'KernelStack: 22032 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433324 kB' 'Committed_AS: 10913040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
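What printed above is get_meminfo capturing an entire meminfo snapshot with mapfile before scanning it key by key; given a node argument, the same helper would read /sys/devices/system/node/node<N>/meminfo and strip the 'Node <n> ' prefix those lines carry, which is what the '${mem[@]#Node +([0-9]) }' expansion is for. A standalone sketch of that capture step, assuming extglob for the +([0-9]) pattern:

    #!/usr/bin/env bash
    shopt -s extglob                     # +([0-9]) below is an extglob pattern
    node=${1:-}                          # optional NUMA node number
    mem_f=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")     # "Node 0 MemTotal: ..." -> "MemTotal: ..."
    printf '%s\n' "${mem[@]:0:3}"        # peek at the first normalized lines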
00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.424 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 
10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 
10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 
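That scan is what just resolved Hugepagesize to 2048 ('echo 2048', 'return 0') and seeded default_hugepages. Stripped of the per-node handling, the whole lookup collapses to a few lines; a sketch assuming the system-wide /proc/meminfo case:

    #!/usr/bin/env bash
    # Return the numeric value for one meminfo key, e.g. Hugepagesize -> 2048.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    default_hugepages=$(get_meminfo Hugepagesize)   # 2048 (kB) on this host
    echo "$default_hugepages"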
00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.425 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:53.426 10:59:17 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:04:53.426 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:53.426 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:53.426 10:59:17 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:53.426 ************************************ 00:04:53.426 START TEST single_node_setup 00:04:53.426 ************************************ 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:53.426 10:59:17 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:56.724 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:56.724 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:56.724 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:56.724 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:56.985 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:58.904 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.904 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41639076 kB' 'MemAvailable: 45325424 kB' 'Buffers: 8940 kB' 'Cached: 12494296 kB' 'SwapCached: 0 kB' 'Active: 9525516 kB' 'Inactive: 3662944 kB' 'Active(anon): 9120104 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 688464 kB' 'Mapped: 143680 kB' 'Shmem: 8434880 kB' 'KReclaimable: 229720 kB' 'Slab: 840432 kB' 'SReclaimable: 229720 kB' 'SUnreclaim: 610712 kB' 'KernelStack: 21968 kB' 'PageTables: 7432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10920932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214452 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
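The HugePages_Total: 1024 in this dump is the result of the NRHUGE=1024 HUGENODE=0 run of setup.sh a few lines back, and verify_nr_hugepages is now reading the numbers back out of meminfo. One plausible shape of the request side (a hedged sketch: needs root, assumes the 2048 kB page size shown above, and setup.sh equally knows the global /proc/sys/vm/nr_hugepages knob):

    #!/usr/bin/env bash
    NRHUGE=1024
    HUGENODE=0
    hp=/sys/devices/system/node/node$HUGENODE/hugepages/hugepages-2048kB/nr_hugepages
    echo "$NRHUGE" > "$hp"
    cat "$hp"   # the kernel reports how many pages it actually reserved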
00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.905 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
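
Each "[[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / "continue" pair in this run is one iteration of that scan as printed by xtrace (bash re-escapes the right-hand pattern character by character, hence the backslashes); the loop simply walks every meminfo key until AnonHugePages matches, then echoes 0. Purely as an equivalent sketch, the same lookup collapses to a single awk call:

awk -F': *' '$1 == "AnonHugePages" {print $2+0; exit}' /proc/meminfo   # -> 0 on this host
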
00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:58.906 10:59:23 
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41640492 kB' 'MemAvailable: 45326840 kB' 'Buffers: 8940 kB' 'Cached: 12494300 kB' 'SwapCached: 0 kB' 'Active: 9520924 kB' 'Inactive: 3662944 kB' 'Active(anon): 9115512 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 684316 kB' 'Mapped: 143640 kB' 'Shmem: 8434884 kB' 'KReclaimable: 229720 kB' 'Slab: 840508 kB' 'SReclaimable: 229720 kB' 'SUnreclaim: 610788 kB' 'KernelStack: 21936 kB' 'PageTables: 7628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10917240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.906 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.907 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41637792 kB' 'MemAvailable: 45324140 kB' 'Buffers: 8940 kB' 'Cached: 12494320 kB' 'SwapCached: 0 kB' 'Active: 9525052 kB' 'Inactive: 3662944 kB' 'Active(anon): 9119640 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 687832 kB' 'Mapped: 144052 kB' 'Shmem: 8434904 kB' 'KReclaimable: 229720 kB' 'Slab: 840504 kB' 'SReclaimable: 229720 kB' 'SUnreclaim: 610784 kB' 'KernelStack: 22048 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10919472 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214388 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.908 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.909 10:59:23 
00:04:58.909 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31-32 -- # [trace condensed: the read loop walks the remaining snapshot keys (Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free), each failing the literal match and hitting continue]
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:04:58.910 nr_hugepages=1024
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:04:58.910 resv_hugepages=0
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:04:58.910 surplus_hugepages=0
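The long stretch of trace above is one technique applied per key: setup/common.sh reads each "Key: value" line of the snapshot with IFS=': ' and compares the key against a backslash-escaped literal (the \H\u\g\e... spelling keeps bash from treating the right-hand side of == as a glob pattern). A minimal, self-contained sketch of that scan, not the SPDK helper itself (which also handles per-node files, shown further below), might look like:

    # Hedged sketch of the meminfo key scan seen in the trace; quoting "$key"
    # has the same literal-match effect as the escaped \H\u\g\e... spelling.
    get_meminfo() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1   # key not present
    }
    get_meminfo HugePages_Rsvd   # prints 0 here, matching the trace's "echo 0"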
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:04:58.910 anon_hugepages=0
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:58.910 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:58.911 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:58.911 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:58.911 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41635868 kB' 'MemAvailable: 45322216 kB' 'Buffers: 8940 kB' 'Cached: 12494340 kB' 'SwapCached: 0 kB' 'Active: 9519476 kB' 'Inactive: 3662944 kB' 'Active(anon): 9114064 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 682272 kB' 'Mapped: 143136 kB' 'Shmem: 8434924 kB' 'KReclaimable: 229720 kB' 'Slab: 840504 kB' 'SReclaimable: 229720 kB' 'SUnreclaim: 610784 kB' 'KernelStack: 22016 kB' 'PageTables: 7604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10914876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
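The two arithmetic guards just traced are the substance of the test: the counts the kernel reports in the snapshot must agree with the requested page count once surplus and reserved pages are included. With the values printed above (HugePages_Total 1024, HugePages_Rsvd 0, HugePages_Surp 0) the check reduces to the following; the variable names are illustrative, copied from the trace rather than from the script source:

    # Values taken from the snapshot above.
    nr_hugepages=1024 surp=0 resv=0 total=1024
    (( total == nr_hugepages + surp + resv )) && echo 'hugepage accounting consistent'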
00:04:58.911 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31-32 -- # [trace condensed: the same scan walks every snapshot key from MemTotal through Unaccepted, each hitting continue, until the target key matches]
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
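get_nodes, just traced, enumerates NUMA nodes by globbing /sys/devices/system/node with the extglob pattern +([0-9]) and derives each node index with ${node##*node}. A hedged sketch of that enumeration, reading the per-node 2 MiB page count straight from sysfs where the SPDK helper parses the node's meminfo instead:

    shopt -s extglob                         # needed for the +([0-9]) glob in the trace
    declare -a nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        idx=${node##*node}                   # /sys/.../node1 -> 1
        nodes_sys[idx]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]} per-node: ${nodes_sys[*]}"   # e.g. no_nodes=2 per-node: 1024 0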
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:58.912 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:58.913 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25138992 kB' 'MemUsed: 7446376 kB' 'SwapCached: 0 kB' 'Active: 3299636 kB' 'Inactive: 154240 kB' 'Active(anon): 3055520 kB' 'Inactive(anon): 0 kB' 'Active(file): 244116 kB' 'Inactive(file): 154240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3073048 kB' 'Mapped: 91968 kB' 'AnonPages: 383836 kB' 'Shmem: 2674692 kB' 'KernelStack: 11608 kB' 'PageTables: 4784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 106840 kB' 'Slab: 420304 kB' 'SReclaimable: 106840 kB' 'SUnreclaim: 313464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
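This second snapshot is per-node: at common.sh@23-24 the helper swaps /proc/meminfo for /sys/devices/system/node/node0/meminfo when a node argument is given and that sysfs file exists, and the @29 expansion strips the "Node 0 " prefix every line of that file carries. A rough sketch of that source selection and cleanup, under the assumption (implied by the +([0-9]) pattern in the trace) that extglob is enabled:

    node=0
    mem_f=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
        && mem_f=/sys/devices/system/node/node$node/meminfo
    shopt -s extglob
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 MemTotal: ..." -> "MemTotal: ..."
    printf '%s\n' "${mem[@]:0:3}"      # first few cleaned lines, as a sanity check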
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@31-32 -- # [trace condensed: the node0 snapshot is scanned key by key, MemTotal through HugePages_Free, until the target key matches]
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:04:58.914 node0=1024 expecting 1024
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:04:58.914
00:04:58.914 real	0m5.519s
00:04:58.914 user	0m1.598s
00:04:58.914 sys	0m2.417s
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable
00:04:58.914 10:59:23 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:04:58.914 ************************************
00:04:58.914 END TEST single_node_setup
00:04:58.914 ************************************
00:04:58.914 10:59:23 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:04:58.914 10:59:23 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:58.914 10:59:23 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:58.914 10:59:23 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:58.914 ************************************
00:04:58.914 START TEST even_2G_alloc
00:04:58.914 ************************************
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
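Before the trace continues with the per-node assignments below, here is a hedged sketch of what get_test_nr_hugepages_per_node is about to do. The numbers in the trace suggest size is in kB against a 2048 kB default page (2097152 kB, i.e. 2 GiB, gives the traced nr_hugepages=1024), split evenly across the two nodes; the variable names echo the trace, not the script source:

    size_kb=2097152 default_hugepage_kb=2048
    total_nodes=2 _no_nodes=2
    nr_hugepages=$(( size_kb / default_hugepage_kb ))                 # 1024
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( nr_hugepages / total_nodes ))   # 512, as traced
        (( _no_nodes-- ))
    done
    echo "nodes_test: ${nodes_test[*]}"   # nodes_test: 512 512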
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:58.914 10:59:23 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:03.134 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:03.134 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
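The "Already using the vfio-pci driver" lines come from setup.sh inspecting each device's current binding before rebinding anything. A minimal, hypothetical equivalent of that per-device check (the BDF below is one example address taken from the list above; the sysfs driver symlink is the standard kernel interface, not an SPDK-specific path):

    bdf=0000:d8:00.0
    drv=$(readlink -f "/sys/bus/pci/devices/$bdf/driver" 2>/dev/null)
    echo "$bdf -> ${drv##*/}"   # prints "vfio-pci" for an already-bound device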
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41632196 kB' 'MemAvailable: 45318340 kB' 'Buffers: 8940 kB' 'Cached: 12494460 kB' 'SwapCached: 0 kB' 'Active: 9522380 kB' 'Inactive: 3662944 kB' 'Active(anon): 9116968 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685256 kB' 'Mapped: 142168 kB' 'Shmem: 8435044 kB' 'KReclaimable: 229312 kB' 'Slab: 839448 kB' 'SReclaimable: 229312 kB' 'SUnreclaim: 610136 kB' 'KernelStack: 21744 kB' 'PageTables: 7396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10905648 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
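The hugepages.sh@95 comparison earlier in this trace is a transparent-hugepage probe: the left-hand side, "always [madvise] never", is the contents of the kernel's THP policy file, where the bracketed word marks the active setting, and the escaped *\[\n\e\v\e\r\]* pattern makes the brackets literal so the test detects a disabled-THP configuration. A minimal sketch of the same probe:

    # The bracketed word in this sysfs file is the active THP policy.
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        echo "THP is not disabled; policy line: $thp"   # e.g. "always [madvise] never"
    fi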
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.134 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 
10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.135 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41631644 kB' 'MemAvailable: 45317788 kB' 'Buffers: 8940 kB' 'Cached: 12494476 kB' 'SwapCached: 0 kB' 'Active: 9523068 kB' 
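The trace above is setup/common.sh's get_meminfo() resolving AnonHugePages to 0. A minimal sketch of that helper, reconstructed from the xtrace alone (details beyond what the trace shows are assumptions, not the verbatim SPDK source):

    #!/usr/bin/env bash
    # Sketch of get_meminfo as traced above: read /proc/meminfo (or a
    # per-node meminfo file when a node number is given) and print the
    # value of one key. "+([0-9])" is an extglob pattern, hence shopt.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=${2:-}
        local var val
        local mem_f=/proc/meminfo mem line
        # Per-node statistics live under /sys; with an empty $node this
        # run probes the nonexistent .../node/node/meminfo and falls
        # back to /proc/meminfo, exactly as the trace shows.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # strip "Node <N> " prefixes
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Non-matching keys are the [[ ... ]] / continue pairs above.
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done
        return 1
    }

    get_meminfo AnonHugePages   # prints 0 on this runner, per the snapshot

Each key that does not match surfaces in the xtrace as one [[ var == pattern ]] test plus a continue, which is why a single lookup emits a near-identical line pair for every /proc/meminfo entry.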
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41631644 kB' 'MemAvailable: 45317788 kB' 'Buffers: 8940 kB' 'Cached: 12494476 kB' 'SwapCached: 0 kB' 'Active: 9523068 kB' 'Inactive: 3662944 kB' 'Active(anon): 9117656 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685924 kB' 'Mapped: 142088 kB' 'Shmem: 8435060 kB' 'KReclaimable: 229312 kB' 'Slab: 839456 kB' 'SReclaimable: 229312 kB' 'SUnreclaim: 610144 kB' 'KernelStack: 21760 kB' 'PageTables: 7516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10906168 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.136 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[... xtrace elided: the read loop continues past every key from MemFree through HugePages_Free until HugePages_Surp matches ...]
00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
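One step worth unpacking: the mem=("${mem[@]#Node +([0-9]) }") seen in every lookup is an extglob prefix strip. Per-node meminfo files under /sys prefix each line with "Node <N> ", while /proc/meminfo does not, so stripping first lets the same IFS=': ' parse handle both sources. A quick illustration with made-up lines (hypothetical values, not from this run):

    #!/usr/bin/env bash
    # Two inputs: one per-node style line, one /proc/meminfo style line.
    shopt -s extglob
    lines=('Node 0 HugePages_Surp: 0' 'HugePages_Surp: 0')
    # Remove a leading "Node <digits> " where present; no-op otherwise.
    printf '%s\n' "${lines[@]#Node +([0-9]) }"
    # Both now read "HugePages_Surp: 0", ready for: IFS=': ' read -r var val _

In this run $node is empty, so the file is /proc/meminfo and the strip changes nothing.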
'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.138 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.139 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
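The heavily escaped right-hand side in these checks (\H\u\g\e\P\a\g\e\s\_\R\s\v\d) is just how xtrace renders a quoted string inside [[ ... == ... ]]: quoting the pattern forces a literal, character-for-character comparison instead of a glob match. A minimal sketch of the scanning idiom traced above, with the field name as an illustrative parameter:

    # Scan /proc/meminfo line by line for one field; quoting "$get"
    # on the right of == makes the comparison literal (xtrace prints
    # that as a backslash-escaped string, as seen in this log).
    get=HugePages_Rsvd
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"   # 0 in this run
        break
    done </proc/meminfo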
setup/common.sh@33 -- # echo 0
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:03.140 nr_hugepages=1024
10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:03.140 resv_hugepages=0
10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:03.140 surplus_hugepages=0
10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:03.140 anon_hugepages=0
10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.140 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41633396 kB' 'MemAvailable: 45319540 kB' 'Buffers: 8940 kB' 'Cached: 12494532 kB' 'SwapCached: 0 kB' 'Active: 9522688 kB' 'Inactive: 3662944 kB' 'Active(anon): 9117276 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685508 kB' 'Mapped: 142088 kB' 'Shmem: 8435116 kB' 'KReclaimable: 229312 kB' 'Slab: 839456 kB' 'SReclaimable: 229312 kB' 'SUnreclaim: 610144 kB' 'KernelStack: 21744 kB' 'PageTables: 7440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10906212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.141 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
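When the field finally matches, the helper echoes its value and returns; the caller captures that stdout and feeds the accounting check traced at setup/hugepages.sh@106, where the total must equal requested pages plus surplus plus reserved. A self-contained sketch of that caller-side logic, with a minimal stand-in for the real get_meminfo:

    #!/usr/bin/env bash
    # Accounting identity behind this phase of the test:
    # HugePages_Total == requested + surplus + reserved.
    get_meminfo() {   # minimal stand-in for setup/common.sh's helper
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done </proc/meminfo
    }
    nr_hugepages=1024   # requested earlier in this run
    surp=$(get_meminfo HugePages_Surp)
    resv=$(get_meminfo HugePages_Rsvd)
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )) \
        && echo "hugepage accounting consistent"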
00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.142 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:03.143 10:59:27 
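get_nodes, traced next, discovers the NUMA topology from sysfs with an extglob pattern and records a per-node page count; with 1024 pages spread over two nodes the kernel has placed 512 on each. An illustrative sketch of that enumeration:

    #!/usr/bin/env bash
    # Enumerate NUMA nodes as the trace above does and record each
    # node's share (512 per node in this run, i.e. 1024 / 2).
    shopt -s extglob nullglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=512   # ${node##*node} -> numeric node id
    done
    echo "no_nodes=${#nodes_sys[@]}"    # 2 on this machine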
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26190076 kB' 'MemUsed: 6395292 kB' 'SwapCached: 0 kB' 'Active: 3301588 kB' 'Inactive: 154240 kB' 'Active(anon): 3057472 kB' 'Inactive(anon): 0 kB' 'Active(file): 244116 kB' 'Inactive(file): 154240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3073076 kB' 'Mapped: 90964 kB' 'AnonPages: 385984 kB' 'Shmem: 2674720 kB' 'KernelStack: 11448 kB' 'PageTables: 4500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 106648 kB' 'Slab: 419836 kB' 'SReclaimable: 106648 kB' 'SUnreclaim: 313188 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.143 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 
10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
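For the per-node queries above (get_meminfo HugePages_Surp 0), the helper switches its source from /proc/meminfo to /sys/devices/system/node/node0/meminfo, where every line carries a "Node N " prefix; mapfile plus an extglob substitution strips that prefix so the same field scan can run unchanged. A sketch under those assumptions:

    #!/usr/bin/env bash
    # Read one node's meminfo and strip the "Node N " prefix so the
    # lines parse exactly like /proc/meminfo.
    shopt -s extglob
    node=0
    mem_f=/proc/meminfo
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Total: 512" -> "HugePages_Total: 512"
    printf '%s\n' "${mem[@]}" | grep '^HugePages_'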
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.144 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698376 kB' 'MemFree: 15447604 kB' 'MemUsed: 12250772 kB' 'SwapCached: 0 kB' 'Active: 6221300 kB' 'Inactive: 3508704 kB' 'Active(anon): 6060004 kB' 'Inactive(anon): 0 kB' 'Active(file): 161296 kB' 'Inactive(file): 3508704 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9430424 kB' 'Mapped: 51124 kB' 'AnonPages: 299700 kB' 'Shmem: 5760424 kB' 'KernelStack: 10296 kB' 'PageTables: 2940 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122664 kB' 'Slab: 419620 kB' 'SReclaimable: 122664 kB' 'SUnreclaim: 296956 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.145 10:59:27 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [xtrace condensed: node-meminfo keys Mlocked through HugePages_Free each tested against HugePages_Surp and skipped via 'continue']
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:03.146 node0=512 expecting 512
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:03.146 node1=512 expecting 512
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:03.146
00:05:03.146 real 0m3.741s
00:05:03.146 user 0m1.396s
00:05:03.146 sys 0m2.412s
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:03.146 10:59:27 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:03.146 ************************************
00:05:03.146 END TEST even_2G_alloc
00:05:03.146 ************************************
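
The per-key scans condensed in this trace all come from the harness's get_meminfo helper: it slurps /proc/meminfo (or a per-node meminfo file), strips the "Node N " prefix the per-node files carry, and walks key/value pairs until the requested key matches, echoing its value (0 in the runs above). A minimal stand-alone sketch of that idea; the helper name is hypothetical, and only the paths and the parsing pattern are taken from the trace:

    shopt -s extglob                            # needed for the +([0-9]) pattern below
    # get_meminfo_sketch KEY [NODE] - echo KEY's value from /proc/meminfo, or from
    # /sys/devices/system/node/node$NODE/meminfo when a node is given.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} var val _ line mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")        # per-node lines start with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done
        echo 0                                  # assumed fallback; the traced runs always match
    }

For example, get_meminfo_sketch HugePages_Free 1 would print node 1's free hugepage count, which is what the node0=512/node1=512 checks above compare against.
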
00:05:03.146 10:59:27 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:03.146 10:59:27 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:03.146 10:59:27 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:03.146 10:59:27 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:03.146 ************************************
00:05:03.146 START TEST odd_alloc
00:05:03.146 ************************************
00:05:03.146 10:59:27 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:03.147 10:59:27 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:06.454 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:06.454 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:06.455 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
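
The nodes_test assignments above are the odd split itself: 1025 pages over two nodes comes out as node0=513, node1=512, with the remainder pushed to the lower-numbered node (the real loop lives around setup/hugepages.sh@80-83). A sketch of that even-split-with-remainder arithmetic; the function is hypothetical and only mirrors the numbers in the trace:

    # split_hugepages_sketch TOTAL NODES - print per-node hugepage counts,
    # giving the remainder to the lower-numbered nodes.
    split_hugepages_sketch() {
        local total=$1 nodes=$2 n
        local -a per_node
        for ((n = 0; n < nodes; n++)); do
            per_node[n]=$((total / nodes))                  # even share
            ((n < total % nodes)) && ((per_node[n] += 1))   # remainder to low nodes
        done
        for ((n = 0; n < nodes; n++)); do
            echo "node$n=${per_node[n]}"
        done
    }

Running split_hugepages_sketch 1025 2 prints node0=513 and node1=512, matching the expectations the verification below checks against.
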
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41622312 kB' 'MemAvailable: 45308388 kB' 'Buffers: 8940 kB' 'Cached: 12494644 kB' 'SwapCached: 0 kB' 'Active: 9528612 kB' 'Inactive: 3662944 kB' 'Active(anon): 9123200 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690780 kB' 'Mapped: 142184 kB' 'Shmem: 8435228 kB' 'KReclaimable: 229176 kB' 'Slab: 839420 kB' 'SReclaimable: 229176 kB' 'SUnreclaim: 610244 kB' 'KernelStack: 21728 kB' 'PageTables: 7392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480876 kB' 'Committed_AS: 10906840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:06.455 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': '; read -r var val _; keys MemTotal through HardwareCorrupted each tested against AnonHugePages and skipped via 'continue']
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
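
With anon=0 established, verify_nr_hugepages goes on to gather HugePages_Surp and HugePages_Rsvd the same way before walking the per-node counts (the sorted_t/sorted_s bookkeeping seen at the end of even_2G_alloc). A condensed sketch of that verification flow, reusing get_meminfo_sketch from above; the exact arithmetic in hugepages.sh may differ, and 'expected' is an assumed global array:

    # verify_sketch - compare per-node free hugepages against the expected split.
    # Caller sets e.g.: expected=(513 512)
    verify_sketch() {
        local anon surp resv node got
        anon=$(get_meminfo_sketch AnonHugePages)    # transparent hugepages must be off
        surp=$(get_meminfo_sketch HugePages_Surp)   # surplus pages beyond nr_hugepages
        resv=$(get_meminfo_sketch HugePages_Rsvd)   # reserved but not yet faulted in
        (( anon == 0 && surp == 0 && resv == 0 )) || return 1   # assumed precondition
        for node in "${!expected[@]}"; do
            got=$(get_meminfo_sketch HugePages_Free "$node")
            echo "node$node=$got expecting ${expected[node]}"
            (( got == expected[node] )) || return 1
        done
    }

This mirrors the 'node0=512 expecting 512' lines in the even test's output; for odd_alloc the expected values become 513 and 512.
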
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.456 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41622200 kB' 'MemAvailable: 45308276 kB' 'Buffers: 8940 kB' 'Cached: 12494664 kB' 'SwapCached: 0 kB' 'Active: 9527336 kB' 'Inactive: 3662944 kB' 'Active(anon): 9121924 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 689980 kB' 'Mapped: 142108 kB' 'Shmem: 8435232 kB' 'KReclaimable: 229176 kB' 'Slab: 839468 kB' 'SReclaimable: 229176 kB' 'SUnreclaim: 610292 kB' 'KernelStack: 21712 kB' 'PageTables: 7336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480876 kB' 'Committed_AS: 10906856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:06.457 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: keys MemTotal through FilePmdMapped each tested against HugePages_Surp and skipped via 'continue']
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: keys CmaTotal through HugePages_Rsvd each tested against HugePages_Surp and skipped via 'continue']
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41622452 kB' 'MemAvailable: 45308528 kB' 'Buffers: 8940 kB' 'Cached: 12494664 kB' 'SwapCached: 0 kB' 'Active: 9527980 kB' 'Inactive: 3662944 kB' 'Active(anon): 9122568 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690632 kB' 'Mapped: 142108 kB' 'Shmem: 8435248 kB' 'KReclaimable: 229176 kB' 'Slab: 839468 kB' 'SReclaimable: 229176 kB' 'SUnreclaim: 610292 kB' 'KernelStack: 21760 kB' 'PageTables: 7492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480876 kB' 'Committed_AS: 10910944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:06.458 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: keys MemTotal through Bounce each tested against HugePages_Rsvd and skipped via 'continue'] 00:05:06.459 10:59:30
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.459 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:05:06.460 nr_hugepages=1025 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:06.460 resv_hugepages=0 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:06.460 surplus_hugepages=0 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:06.460 anon_hugepages=0 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41625028 kB' 'MemAvailable: 45311104 kB' 'Buffers: 8940 kB' 'Cached: 12494684 kB' 'SwapCached: 0 kB' 'Active: 9527444 kB' 'Inactive: 3662944 kB' 'Active(anon): 9122032 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690056 kB' 'Mapped: 142108 kB' 'Shmem: 8435268 kB' 'KReclaimable: 229176 kB' 'Slab: 839496 kB' 'SReclaimable: 229176 kB' 'SUnreclaim: 610320 kB' 'KernelStack: 21712 kB' 'PageTables: 7336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480876 kB' 'Committed_AS: 
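Note: the trace above shows the get_meminfo helper resolving HugePages_Rsvd to 0 by loading a meminfo file into an array and walking it field by field with IFS=': '. Below is a minimal standalone sketch of that pattern; the name get_meminfo_sketch and the exact structure are assumptions for illustration, not the actual setup/common.sh code.

#!/usr/bin/env bash
# Sketch of the field-scan pattern traced above (illustrative, assumed).
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=${2:-}      # key to look up, optional NUMA node
    local var val _ line
    local mem_f=/proc/meminfo
    # Per-node queries read the sysfs copy instead, when it exists.
    if [[ -e /sys/devices/system/node/node${node}/meminfo ]]; then
        mem_f=/sys/devices/system/node/node${node}/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Node files prefix every line with "Node <n> "; strip it (extglob pattern).
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # the per-field skip seen in the trace
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo_sketch HugePages_Rsvd      # printed 0 on the machine traced here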
00:05:06.460 10:59:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.460 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41625028 kB' 'MemAvailable: 45311104 kB' 'Buffers: 8940 kB' 'Cached: 12494684 kB' 'SwapCached: 0 kB' 'Active: 9527444 kB' 'Inactive: 3662944 kB' 'Active(anon): 9122032 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690056 kB' 'Mapped: 142108 kB' 'Shmem: 8435268 kB' 'KReclaimable: 229176 kB' 'Slab: 839496 kB' 'SReclaimable: 229176 kB' 'SUnreclaim: 610320 kB' 'KernelStack: 21712 kB' 'PageTables: 7336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480876 kB' 'Committed_AS: 10906896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
[... per-field comparison trace elided: each field above is tested against HugePages_Total and skipped via continue until the match below ...]
00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
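Note: at hugepages.sh@111 the test recomputes the expected per-node split. 1025 pages cannot divide evenly across 2 NUMA nodes, so the odd_alloc case plants 513 on node0 and 512 on node1 (513 + 512 == 1025, matching the HugePages_Total read back above). A rough sketch of that bookkeeping, with the arithmetic assumed from what the trace shows rather than copied from hugepages.sh:

#!/usr/bin/env bash
# Sketch of the odd_alloc split checked above (assumed arithmetic).
shopt -s extglob nullglob

nr_hugepages=1025
declare -a nodes_sys

for node in /sys/devices/system/node/node+([0-9]); do
    idx=${node##*node}      # /sys/.../node0 -> 0, node1 -> 1
    # Give the remainder of the odd total to node 0, as the trace shows.
    nodes_sys[idx]=$(( nr_hugepages / 2 + (idx == 0 ? nr_hugepages % 2 : 0) ))
done

no_nodes=${#nodes_sys[@]}
echo "no_nodes=$no_nodes split=${nodes_sys[*]}"   # "no_nodes=2 split=513 512" here

# Consistency check in the spirit of hugepages.sh@106/@109:
total=0
for n in "${nodes_sys[@]}"; do (( total += n )); done
(( total == nr_hugepages )) && echo "per-node split sums to nr_hugepages"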
00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.462 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26178972 kB' 'MemUsed: 6406396 kB' 'SwapCached: 0 kB' 'Active: 3306532 kB' 'Inactive: 154240 kB' 'Active(anon): 3062416 kB' 'Inactive(anon): 0 kB' 'Active(file): 244116 kB' 'Inactive(file): 154240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3073152 kB' 'Mapped: 90972 kB' 'AnonPages: 390884 kB' 'Shmem: 2674796 kB' 'KernelStack: 11432 kB' 'PageTables: 4472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 106600 kB' 'Slab: 419932 kB' 'SReclaimable: 106600 kB' 'SUnreclaim: 313332 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[... per-field comparison trace elided: each node0 field above is tested against HugePages_Surp and skipped via continue until the match below ...]
00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
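Note: the loop at hugepages.sh@114-116 walks each node, folds the reserved count into the expected per-node total, and reads HugePages_Surp from that node's own sysfs meminfo, whose lines carry a "Node <n> " prefix. A hedged sketch of that per-node read and accumulation; node_surp() is an illustrative stand-in, not the real helper:

#!/usr/bin/env bash
# Sketch of the per-node surplus accounting traced above (assumed names).
node_surp() {
    local node=$1 var val _
    # Per-node meminfo lines read "Node <n> HugePages_Surp: <count>";
    # strip the prefix so IFS=': ' splits key from value cleanly.
    while IFS=': ' read -r var val _; do
        [[ $var == HugePages_Surp ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ //' \
             "/sys/devices/system/node/node${node}/meminfo")
    return 1
}

declare -a nodes_test=(513 512)   # expected odd_alloc split for this run
resv=0                            # reserved pages, per the trace above

for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    surp=$(node_surp "$node") || continue
    (( nodes_test[node] += surp ))    # surplus was 0 on both nodes in this run
    echo "node${node}: expect ${nodes_test[node]} hugepages"
done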
00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.463 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698376 kB' 'MemFree: 15448260 kB' 'MemUsed: 12250116 kB' 'SwapCached: 0 kB' 'Active: 6221092 kB' 'Inactive: 3508704 kB' 'Active(anon): 6059796 kB' 'Inactive(anon): 0 kB' 'Active(file): 161296 kB' 'Inactive(file): 3508704 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9430516 kB' 'Mapped: 51144 kB' 'AnonPages: 299388 kB' 'Shmem: 5760516 kB' 'KernelStack: 10216 kB' 'PageTables: 2696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122576 kB' 'Slab: 419564 kB' 'SReclaimable: 122576 kB' 'SUnreclaim: 296988 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... per-field comparison trace continues as above: each node1 field is tested against HugePages_Surp and skipped via continue ...]
00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.464 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.465 10:59:31 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.465 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:05:06.726 node0=513 expecting 513 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:06.726 node1=512 expecting 512 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:06.726 00:05:06.726 real 0m3.783s 00:05:06.726 user 0m1.394s 00:05:06.726 sys 0m2.462s 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:06.726 10:59:31 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:06.726 ************************************ 00:05:06.726 END TEST odd_alloc 00:05:06.726 ************************************ 00:05:06.726 10:59:31 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc 00:05:06.726 10:59:31 setup.sh.hugepages -- common/autotest_common.sh@1105 -- 
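[editor's note] For readers following the xtrace above: a minimal standalone sketch of the meminfo lookup that odd_alloc keeps calling, reconstructed from the setup/common.sh@16-33 trace lines -- an approximation for illustration, not the verbatim SPDK helper:

    #!/usr/bin/env bash
    # Sketch of get_meminfo KEY [NODE], as reconstructed from the trace above.
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _ line
        local mem_f=/proc/meminfo
        # Prefer the per-NUMA-node stats when a node is given and sysfs has them.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            # This read/compare pair is the long key-by-key scan in the trace.
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Surp 1   # prints 0 against the node1 dump captured above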
00:05:06.726 10:59:31 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:05:06.726 10:59:31 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:06.726 10:59:31 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:06.726 10:59:31 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:06.726 ************************************
00:05:06.726 START TEST custom_alloc
00:05:06.726 ************************************
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=()
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:05:06.726 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
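[editor's note] The sizing above follows from the values the trace prints; a worked check, assuming the size argument is in kB and default_hugepages equals the 2048 kB Hugepagesize reported in the meminfo dumps later in this log:

    # Worked check of the first get_test_nr_hugepages call (assumed units: kB).
    size=1048576                                    # argument seen in the trace
    default_hugepages=2048                          # assumption: 2048 kB hugepages
    nr_hugepages=$(( size / default_hugepages ))    # 1048576 / 2048 = 512
    _no_nodes=2
    echo $(( nr_hugepages / _no_nodes ))            # 256 per node, matching the
                                                    # two nodes_test[...]=256 lines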
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 ))
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:06.727 10:59:31 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:10.023 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:10.024 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
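[editor's note] With both sizing rounds done, nodes_hp holds 512 pages for node 0 and 1024 for node 1; the @171-173 loop above serialized that plan into the HUGENODE string handed to scripts/setup.sh, and 512 + 1024 = 1536 is the nr_hugepages total that verify_nr_hugepages checks next. The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test that follows appears to read the transparent_hugepage "enabled" setting and confirm THP is not pinned to "never" before AnonHugePages is sampled. A sketch of the serialization, reconstructed from the trace (not the script itself):

    # Reconstruction of the HUGENODE assembly traced at setup/hugepages.sh@171-177.
    IFS=,                          # custom_alloc sets "local IFS=," up front
    nodes_hp=([0]=512 [1]=1024)    # per-node plan from the two sizing rounds
    HUGENODE=()
    _nr_hugepages=0
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( _nr_hugepages += nodes_hp[node] ))
    done
    echo "${HUGENODE[*]}"          # -> nodes_hp[0]=512,nodes_hp[1]=1024
    echo "$_nr_hugepages"          # -> 1536, the total handed to setup.sh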
10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 40588884 kB' 'MemAvailable: 44274920 kB' 'Buffers: 8940 kB' 'Cached: 12494816 kB' 'SwapCached: 0 kB' 'Active: 9533716 kB' 'Inactive: 3662944 kB' 'Active(anon): 9128304 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 696248 kB' 'Mapped: 142568 kB' 'Shmem: 8435400 kB' 'KReclaimable: 229096 kB' 'Slab: 839948 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 610852 kB' 'KernelStack: 21792 kB' 'PageTables: 7632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957612 kB' 'Committed_AS: 10911260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.289 10:59:34 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.289 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.290 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 40587068 kB' 'MemAvailable: 44273104 kB' 'Buffers: 8940 kB' 'Cached: 12494820 kB' 'SwapCached: 0 kB' 'Active: 9536280 kB' 'Inactive: 3662944 kB' 'Active(anon): 9130868 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 695792 kB' 'Mapped: 142528 kB' 'Shmem: 8435404 kB' 'KReclaimable: 229096 kB' 'Slab: 840008 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 610912 kB' 'KernelStack: 21872 kB' 'PageTables: 7904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957612 kB' 'Committed_AS: 10912732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:10.291 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-@32 [read loop skipped remaining non-matching /proc/meminfo keys Zswapped through HugePages_Rsvd]
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
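The @31-@33 entries above trace get_meminfo's matching loop: each meminfo line is split on ': ' into a key and a value, non-matching keys are skipped with continue, and the value of the first matching key is echoed before the function returns. A minimal sketch of that loop, reconstructed from the trace (the function name get_meminfo_value is hypothetical, the mem array is assumed to already hold the snapshot, and the shipped setup/common.sh may differ in detail):

    get_meminfo_value() {
        # sketch only -- not the shipped setup/common.sh helper
        local get=$1 line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # split "Key: value [kB]"
            [[ $var == "$get" ]] || continue          # skip non-matching keys
            echo "$val"                               # e.g. prints 0 for HugePages_Surp in this run
            return 0
        done
    }

With mem=("HugePages_Surp: 0"), get_meminfo_value HugePages_Surp would print 0, matching the echo 0 traced above.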
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 40595748 kB' 'MemAvailable: 44281784 kB' 'Buffers: 8940 kB' 'Cached: 12494836 kB' 'SwapCached: 0 kB' 'Active: 9531156 kB' 'Inactive: 3662944 kB' 'Active(anon): 9125744 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 693692 kB' 'Mapped: 142116 kB' 'Shmem: 8435420 kB' 'KReclaimable: 229096 kB' 'Slab: 840008 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 610912 kB' 'KernelStack: 21872 kB' 'PageTables: 7872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957612 kB' 'Committed_AS: 10907568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:10.293 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-@32 [read loop skipped non-matching keys MemTotal through HugePages_Free]
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536
00:05:10.295 nr_hugepages=1536
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:10.295 resv_hugepages=0
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:10.295 surplus_hugepages=0
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:10.295 anon_hugepages=0
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
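The two arithmetic entries above are the pool consistency check: the configured pool size must equal nr_hugepages plus the surplus and reserved counts returned by the two get_meminfo queries. With surp=0 and resv=0 here, 1536 == 1536 + 0 + 0 holds and the test proceeds. A standalone sketch of the same check, with variable names taken from the trace:

    nr_hugepages=1536 surp=0 resv=0
    if (( 1536 == nr_hugepages + surp + resv )) && (( 1536 == nr_hugepages )); then
        echo "hugepage pool accounting is consistent"   # 1536 == 1536 + 0 + 0
    fi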
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 40596152 kB' 'MemAvailable: 44282188 kB' 'Buffers: 8940 kB' 'Cached: 12494876 kB' 'SwapCached: 0 kB' 'Active: 9530780 kB' 'Inactive: 3662944 kB' 'Active(anon): 9125368 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 693264 kB' 'Mapped: 142116 kB' 'Shmem: 8435460 kB' 'KReclaimable: 229096 kB' 'Slab: 840008 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 610912 kB' 'KernelStack: 21856 kB' 'PageTables: 7824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957612 kB' 'Committed_AS: 10907588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
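Note the @23 existence test: with no node argument, node stays empty, so it probes /sys/devices/system/node/node/meminfo (no node number), fails, and mem_f remains /proc/meminfo; when a node argument is supplied, as for node 0 further down, the per-NUMA-node meminfo file is used instead. A sketch of that source selection, reconstructed from the @22-@24 trace entries:

    node=            # empty for a system-wide query; a node number for a per-node one
    mem_f=/proc/meminfo                                    # default: system-wide snapshot
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo   # per-NUMA-node snapshot
    fi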
00:05:10.295 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-@32 [read loop skipped non-matching keys MemTotal through Unaccepted]
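The snapshot printed above is internally consistent on the hugepage side: with Hugepagesize at 2048 kB, the 1536-page pool accounts for exactly the reported 'Hugetlb: 3145728 kB'. The arithmetic, as a one-liner:

    echo $(( 1536 * 2048 ))   # 3145728, i.e. the reported 'Hugetlb: 3145728 kB'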
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:10.296 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26207264 kB' 'MemUsed: 6378104 kB' 'SwapCached: 0 kB' 'Active: 3309024 kB' 'Inactive: 154240 kB' 'Active(anon): 3064908 kB' 'Inactive(anon): 0 kB' 'Active(file): 244116 kB' 'Inactive(file): 154240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3073244 kB' 'Mapped: 90964 kB' 'AnonPages: 393240 kB' 'Shmem: 2674888 kB' 'KernelStack: 11544 kB' 'PageTables: 4844 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 106536 kB' 'Slab: 420060 kB' 'SReclaimable: 106536 kB' 'SUnreclaim: 313524 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-@32 [read loop began on node0 meminfo and skipped non-matching keys MemTotal through Active(anon)]
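The node0 snapshot reports HugePages_Total: 512, matching the split this test configured in the nodes_sys assignments above: 512 pages on node 0 and 1024 on node 1, which together make up the 1536-page pool. A quick check of that split, with the per-node counts taken from the trace:

    nodes_sys=(512 1024)   # node0 and node1 allocations from the trace above
    total=0
    for n in "${nodes_sys[@]}"; do (( total += n )); done
    echo "$total"          # 1536, the full pool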
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.297 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:10.298 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:10.559 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:10.559 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:10.559 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698376 kB' 'MemFree: 14389204 kB' 'MemUsed: 13309172 kB' 'SwapCached: 0 kB' 'Active: 6222180 kB' 'Inactive: 3508704 kB' 'Active(anon): 6060884 kB' 'Inactive(anon): 0 kB' 'Active(file): 161296 kB' 'Inactive(file): 
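The get_meminfo calls traced above all follow the same pattern: pick /proc/meminfo or the per-node sysfs copy, strip the "Node N " prefix that sysfs adds, then scan line by line for the requested key. A condensed sketch, reconstructed from the xtrace rather than copied from SPDK's setup/common.sh:

#!/usr/bin/env bash
shopt -s extglob                      # the +([0-9]) pattern below needs extglob

# get_meminfo KEY [NODE]: print KEY's value from /proc/meminfo, or from
# /sys/devices/system/node/nodeN/meminfo when NODE is given (sketch).
get_meminfo() {
    local get=$1 node=$2 var val _ line
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")  # drop the "Node 1 " prefix sysfs adds
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue   # the long skip loop seen in the trace
        echo "$val"
        return 0
    done
    return 1
}

Called as get_meminfo HugePages_Surp 0 or get_meminfo HugePages_Surp 1, it prints that node's surplus-page count; both nodes report 0 in this run.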
00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698376 kB' 'MemFree: 14389204 kB' 'MemUsed: 13309172 kB' 'SwapCached: 0 kB' 'Active: 6222180 kB' 'Inactive: 3508704 kB' 'Active(anon): 6060884 kB' 'Inactive(anon): 0 kB' 'Active(file): 161296 kB' 'Inactive(file): 3508704 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9430576 kB' 'Mapped: 51152 kB' 'AnonPages: 300452 kB' 'Shmem: 5760576 kB' 'KernelStack: 10328 kB' 'PageTables: 3028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122560 kB' 'Slab: 419948 kB' 'SReclaimable: 122560 kB' 'SUnreclaim: 297388 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:10.560 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [xtrace condensed: the skip loop walks every node-1 field from MemTotal through HugePages_Free; none matches HugePages_Surp]
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:10.561 node0=512 expecting 512
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
00:05:10.561 node1=1024 expecting 1024
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:10.561 real 0m3.783s
00:05:10.561 user 0m1.348s
00:05:10.561 sys 0m2.501s
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:10.561 10:59:34 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:10.561 ************************************
00:05:10.561 END TEST custom_alloc
00:05:10.561 ************************************
00:05:10.561 10:59:35 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:10.561 10:59:35 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:10.561 10:59:35 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:10.561 10:59:35 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:10.561 ************************************
00:05:10.561 START TEST no_shrink_alloc 00:05:10.561 ************************************ 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:10.561 10:59:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:13.859 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:13.859 0000:80:04.5 (8086 
2021): Already using the vfio-pci driver
00:05:13.859 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.859 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.859 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.859 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.859 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.859 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41651660 kB' 'MemAvailable: 45337696 kB' 'Buffers: 8940 kB' 'Cached: 12495004 kB' 'SwapCached: 0 kB' 'Active: 9536196 kB' 'Inactive: 3662944 kB' 'Active(anon): 9130784 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 697524 kB' 'Mapped: 142204 kB' 'Shmem: 8435588 kB' 'KReclaimable: 229096 kB' 'Slab: 840452 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 611356 kB' 'KernelStack: 21776 kB' 'PageTables: 7576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10908364 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
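The 1024-page pool being verified here comes from the get_test_nr_hugepages 2097152 0 call traced at the start of this test, and the dump above confirms it: Hugepagesize 2048 kB times HugePages_Total 1024 equals Hugetlb 2097152 kB. A sketch of that arithmetic, reconstructed from the hugepages.sh@48-@56 trace (SPDK's helper also handles per-node splits, which this omits):

# Derive the hugepage count a requested size implies (reconstruction).
default_hugepages=2048                        # kB, Hugepagesize from meminfo
size=2097152                                  # kB, the argument seen in the trace
(( size >= default_hugepages )) || exit 1     # @54: reject sub-page requests
nr_hugepages=$(( size / default_hugepages ))  # 2097152 / 2048 = 1024
NRHUGE=$nr_hugepages                          # @188: handed to setup.sh
HUGENODE=0                                    # @188: pin the whole pool to node 0
echo "requesting $nr_hugepages hugepages on node 0"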
00:05:14.125 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [xtrace condensed: the skip loop walks MemTotal through HardwareCorrupted; none matches AnonHugePages]
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
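The anon term just computed is gated by the transparent-hugepage policy: the trace above tested always [madvise] never against *[never]* before reading AnonHugePages. A hedged reconstruction of that probe (variable names follow the hugepages.sh trace, not the verbatim source):

# Count THP-backed anonymous memory only when THP is not fully disabled.
anon=0
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)                   # 0 kB in this run
fi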
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.126 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41651472 kB' 'MemAvailable: 45337508 kB' 'Buffers: 8940 kB' 'Cached: 12495008 kB' 'SwapCached: 0 kB' 'Active: 9535392 kB' 'Inactive: 3662944 kB' 'Active(anon): 9129980 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 697744 kB' 'Mapped: 142128 kB' 'Shmem: 8435592 kB' 'KReclaimable: 229096 kB' 'Slab: 840464 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 611368 kB' 'KernelStack: 21776 kB' 'PageTables: 7564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10908384 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
[... key-by-key xtrace of every /proc/meminfo record being tested against HugePages_Surp (and hitting "continue") elided ...]
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
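One detail of the helper's prologue (it recurs just below for HugePages_Rsvd): node= is empty, so the common.sh@23 existence test probes the literal path /sys/devices/system/node/node/meminfo, which cannot exist, and the helper keeps the system-wide mem_f=/proc/meminfo set at @22. A hedged reconstruction of that selection step, with variable names from the trace but the control structure assumed (select_mem_file is a hypothetical wrapper name used here only for illustration):

    # Sketch of the node-selection step at common.sh@22-25 (structure assumed).
    select_mem_file() {
        local node=${1:-}            # optional NUMA node number; empty in this run
        local mem_f=/proc/meminfo    # system-wide default, as seen at @22
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo   # per-node counters
        fi
        echo "$mem_f"
    }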
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.128 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41651640 kB' 'MemAvailable: 45337676 kB' 'Buffers: 8940 kB' 'Cached: 12495024 kB' 'SwapCached: 0 kB' 'Active: 9535404 kB' 'Inactive: 3662944 kB' 'Active(anon): 9129992 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 697748 kB' 'Mapped: 142128 kB' 'Shmem: 8435608 kB' 'KReclaimable: 229096 kB' 'Slab: 840464 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 611368 kB' 'KernelStack: 21776 kB' 'PageTables: 7564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10908404 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
[... key-by-key xtrace of every /proc/meminfo record being tested against HugePages_Rsvd (and hitting "continue") elided ...]
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:14.130 nr_hugepages=1024
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:14.130 resv_hugepages=0
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:14.130 surplus_hugepages=0
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:14.130 anon_hugepages=0
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
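The hugepages.sh@101-108 records above are the payoff of the scans: the test prints the derived counters and asserts that the expected pool size (1024 pages here) equals nr_hugepages plus the surplus and reserved counts it just read. The same invariant can be checked directly; the sketch below reuses the get_meminfo stub from earlier and reads the configured pool from the standard /proc/sys/vm/nr_hugepages knob (pairing the constant with these exact sources is an assumption, while the arithmetic mirrors @106):

    # Sketch: hugepage accounting check in the spirit of hugepages.sh@106-108.
    expected=1024                                 # pool size this test configured
    surp=$(get_meminfo HugePages_Surp)            # 0 in the snapshot above
    resv=$(get_meminfo HugePages_Rsvd)            # 0 in the snapshot above
    nr_hugepages=$(< /proc/sys/vm/nr_hugepages)   # kernel's persistent pool size
    (( expected == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2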
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.130 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.131 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41650884 kB' 'MemAvailable: 45336920 kB' 'Buffers: 8940 kB' 'Cached: 12495064 kB' 'SwapCached: 0 kB' 'Active: 9535072 kB' 'Inactive: 3662944 kB' 'Active(anon): 9129660 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 697336 kB' 'Mapped: 142128 kB' 'Shmem: 8435648 kB' 'KReclaimable: 229096 kB' 'Slab: 840464 kB' 'SReclaimable: 229096 kB' 'SUnreclaim: 611368 kB' 'KernelStack: 21760 kB' 'PageTables: 7516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10908428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
[... key-by-key xtrace of the scan against HugePages_Total elided; the captured section ends mid-scan ...]
setup/common.sh@31 -- # read -r var val _ 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:14.132 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.394 10:59:38 
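The loop traced above is setup/common.sh's get_meminfo helper: it loads /proc/meminfo (or a per-node meminfo file from sysfs), then walks the dump field by field until the requested key matches and prints that key's value. A minimal sketch of the pattern, reconstructed from the @16-@33 trace lines; the function wrapper and exact argument handling here are assumptions, not the verbatim SPDK source:

    #!/usr/bin/env bash
    shopt -s extglob    # needed for the +([0-9]) pattern below

    # get_meminfo KEY [NODE] -- print one meminfo field (kB or page count),
    # preferring the per-node file when NODE is given. (Sketch of the
    # pattern traced above, not the verbatim setup/common.sh.)
    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node meminfo prefixes every line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        # Walk the dump key by key; skip everything until the requested field.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo HugePages_Total      # -> 1024 on this host, per the @33 echo
    get_meminfo HugePages_Surp 0     # -> 0 for node0, as extracted next

The repeated IFS=': '/read/continue entries in the trace are exactly the iterations of that while loop; the @16 printf is the process substitution feeding it.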
00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.394 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.395 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:14.395 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:14.395 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25153512 kB' 'MemUsed: 7431856 kB' 'SwapCached: 0 kB' 'Active: 3313104 kB' 'Inactive: 154240 kB' 'Active(anon): 3068988 kB' 'Inactive(anon): 0 kB' 'Active(file): 244116 kB' 'Inactive(file): 154240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3073368 kB' 'Mapped: 90964 kB' 'AnonPages: 397164 kB' 'Shmem: 2675012 kB' 'KernelStack: 11464 kB' 'PageTables: 4568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 106536 kB' 'Slab: 420260 kB' 'SReclaimable: 106536 kB' 'SUnreclaim: 313724 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... repeated xtrace entries elided: the same field-by-field scan runs over the node0 dump above, skipping every key until HugePages_Surp matches ...]
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:14.396 node0=1024 expecting 1024
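The @111-@127 entries are the per-node cross-check: get_nodes records the kernel-reported count for each NUMA node in nodes_sys, while the expected per-node counts in nodes_test are adjusted by reserved and surplus pages before being printed side by side. A condensed sketch of that bookkeeping, using the names from the trace and the get_meminfo sketch above (the exact control flow in setup/hugepages.sh may differ):

    shopt -s extglob
    declare -A nodes_sys nodes_test
    nodes_test[0]=1024    # expected pages on node0 (assumed seeded by the test)
    resv=0                # reserved pages; HugePages_Rsvd is 0 in this run

    get_nodes() {
        local node
        # nodes_sys[] = what the kernel actually reports per NUMA node.
        for node in /sys/devices/system/node/node+([0-9]); do
            nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
        done
        (( ${#nodes_sys[@]} > 0 ))    # the no_nodes > 0 guard from @31-@32
    }

    get_nodes
    # Fold reserved and surplus pages into each expected count, then report
    # it next to the kernel's view -- this prints "node0=1024 expecting 1024".
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
    done

With 1024 pages on node0 and 0 on node1, both sides agree, so the @129 comparison that follows passes.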
10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:14.396 10:59:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:17.693 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:17.693 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:17.693 INFO: Requested 512 hugepages but 1024 already allocated on node0
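That INFO line is the heart of the no_shrink_alloc case: with 1024 pages already allocated on node 0, the suite re-runs scripts/setup.sh asking for only 512 (NRHUGE=512, HUGENODE=0) while CLEAR_HUGE=no keeps the existing pool, and setup.sh must leave the larger allocation in place rather than shrink it. Reproduced by hand, the invocation looks roughly like this (same path as the @10 entry above; the output naturally depends on the host's current pool):

    # Ask for 512 hugepages on node 0 without clearing the 1024 already there;
    # setup.sh keeps the larger pool and only reports the mismatch.
    CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
    # -> INFO: Requested 512 hugepages but 1024 already allocated on node0

verify_nr_hugepages then runs again to confirm the pool really was left untouched: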
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41637212 kB' 'MemAvailable: 45323180 kB' 'Buffers: 8940 kB' 'Cached: 12495148 kB' 'SwapCached: 0 kB' 'Active: 9539816 kB' 'Inactive: 3662944 kB' 'Active(anon): 9134404 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701944 kB' 'Mapped: 142280 kB' 'Shmem: 8435732 kB' 'KReclaimable: 228960 kB' 'Slab: 840372 kB' 'SReclaimable: 228960 kB' 'SUnreclaim: 611412 kB' 'KernelStack: 21952 kB' 'PageTables: 7960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10913320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
[... repeated xtrace entries elided: the field-by-field scan runs over the dump above (MemTotal through HardwareCorrupted) until AnonHugePages matches ...]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41637256 kB' 'MemAvailable: 45323208 kB' 'Buffers: 8940 kB' 'Cached: 12495152 kB' 'SwapCached: 0 kB' 'Active: 9539300 kB' 'Inactive: 3662944 kB' 'Active(anon): 9133888 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701468 kB' 'Mapped: 142164 kB' 'Shmem: 8435736 kB' 'KReclaimable: 228928 kB' 'Slab: 840240 kB' 'SReclaimable: 228928 kB' 'SUnreclaim: 611312 kB' 'KernelStack: 21952 kB' 'PageTables: 8060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10911912 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
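At this point verify_nr_hugepages has anon=0 from the AnonHugePages lookup and is extracting HugePages_Surp from the dump just printed. Both dumps report 1024 total pages, 0 surplus, and 0 reserved, and those are exactly the inputs to the check already seen at hugepages.sh@109. Spelled out with this host's numbers (a sketch of the traced arithmetic, not the verbatim script):

    # Values reported in the meminfo dumps above.
    nr_hugepages=1024    # HugePages_Total
    surp=0               # HugePages_Surp
    resv=0               # HugePages_Rsvd
    # hugepages.sh@109-style check: the pool still holds every expected page,
    # i.e. setup.sh did not shrink the 1024-page allocation down to 512.
    (( 1024 == nr_hugepages + surp + resv )) && echo "1024 hugepages verified"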
[... repeated xtrace entries elided: the HugePages_Surp scan walks the dump above field by field (MemTotal through WritebackTmp here) and continues below ...]
00:05:17.963 10:59:42
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.963 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.964 10:59:42 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41636724 kB' 'MemAvailable: 45322676 kB' 'Buffers: 8940 kB' 'Cached: 12495152 kB' 'SwapCached: 0 kB' 'Active: 9539024 kB' 'Inactive: 3662944 kB' 'Active(anon): 9133612 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701192 kB' 'Mapped: 142164 kB' 'Shmem: 8435736 kB' 'KReclaimable: 228928 kB' 'Slab: 840240 kB' 'SReclaimable: 228928 kB' 'SUnreclaim: 611312 kB' 'KernelStack: 21696 kB' 'PageTables: 7348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10910316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
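For readability: the condensed xtrace above is exercising the suite's meminfo lookup helper. Below is a minimal sketch of that helper, reconstructed only from the trace tags (setup/common.sh@17-33); the actual repository code may differ in detail, so treat the structure as an assumption.

# Sketch of get_meminfo as implied by the xtrace above -- reconstructed,
# not copied from the repo. Requires bash with extglob for the +([0-9]) pattern.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}   # key to look up; optional NUMA node number
    local var val _
    local mem_f=/proc/meminfo
    local -a mem

    # A per-node query reads that node's own meminfo file instead (common.sh@23-24).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node <n> " prefix on per-node lines (common.sh@29)

    # Walk every "Key: value ..." line until the requested key matches;
    # each non-matching key is one of the "continue" lines in the trace.
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

# Usage, matching the calls in this log:
#   get_meminfo HugePages_Surp      -> 0    (system-wide, /proc/meminfo)
#   get_meminfo HugePages_Surp 0    -> 0    (node 0 meminfo, queried further below)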
00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17-29 -- [xtrace condensed: get=HugePages_Rsvd, node=, mem_f=/proc/meminfo, mapfile -t mem, "Node <n> " prefixes stripped]
00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41636724 kB' 'MemAvailable: 45322676 kB' 'Buffers: 8940 kB' 'Cached: 12495152 kB' 'SwapCached: 0 kB' 'Active: 9539024 kB' 'Inactive: 3662944 kB' 'Active(anon): 9133612 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701192 kB' 'Mapped: 142164 kB' 'Shmem: 8435736 kB' 'KReclaimable: 228928 kB' 'Slab: 840240 kB' 'SReclaimable: 228928 kB' 'SUnreclaim: 611312 kB' 'KernelStack: 21696 kB' 'PageTables: 7348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10910316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:17.964 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- [xtrace condensed: the read loop again walks every key from MemTotal through HugePages_Free, continuing until it reaches HugePages_Rsvd]
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
resv_hugepages=0
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
anon_hugepages=0
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
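The two arithmetic checks above are the point of this whole passage: the pool the test requested must be fully accounted for by the values just scraped from /proc/meminfo. A sketch of that bookkeeping with the values the log just printed (a restatement, not repository code):

# Pool consistency check as traced at setup/hugepages.sh@106-108.
nr_hugepages=1024   # pages requested by the no_shrink_alloc test
surp=0              # HugePages_Surp, first get_meminfo call above
resv=0              # HugePages_Rsvd, second call
total=1024          # HugePages_Total, fetched by the very next call in the log

if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage pool matches the request: no surplus or reserved pages leaked"
fi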
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17-29 -- [xtrace condensed: get=HugePages_Total, node=, mem_f=/proc/meminfo, mapfile -t mem, "Node <n> " prefixes stripped]
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283744 kB' 'MemFree: 41635744 kB' 'MemAvailable: 45321696 kB' 'Buffers: 8940 kB' 'Cached: 12495192 kB' 'SwapCached: 0 kB' 'Active: 9539104 kB' 'Inactive: 3662944 kB' 'Active(anon): 9133692 kB' 'Inactive(anon): 0 kB' 'Active(file): 405412 kB' 'Inactive(file): 3662944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701232 kB' 'Mapped: 142164 kB' 'Shmem: 8435776 kB' 'KReclaimable: 228928 kB' 'Slab: 840240 kB' 'SReclaimable: 228928 kB' 'SUnreclaim: 611312 kB' 'KernelStack: 21760 kB' 'PageTables: 7828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481900 kB' 'Committed_AS: 10910336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 574836 kB' 'DirectMap2M: 11694080 kB' 'DirectMap1G: 57671680 kB'
00:05:17.966 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- [xtrace condensed: the read loop walks every key from MemTotal through Unaccepted, continuing until it reaches HugePages_Total]
00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25146960 kB' 'MemUsed: 7438408 kB' 'SwapCached: 0 kB' 'Active: 3316772 kB' 'Inactive: 154240 kB' 'Active(anon): 3072656 kB' 'Inactive(anon): 0 kB' 'Active(file): 244116 kB' 'Inactive(file): 154240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3073496 kB' 'Mapped: 90964 kB' 'AnonPages: 400736 kB' 'Shmem: 2675140 kB' 'KernelStack: 11448 kB' 'PageTables: 4524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 106408 kB' 'Slab: 420300 kB' 'SReclaimable: 106408 kB' 'SUnreclaim: 313892 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.968 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:17.969 node0=1024 expecting 1024 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:17.969 00:05:17.969 real 0m7.435s 00:05:17.969 user 0m2.690s 00:05:17.969 sys 0m4.878s 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:17.969 10:59:42 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:17.969 ************************************ 00:05:17.969 END TEST no_shrink_alloc 00:05:17.969 ************************************ 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:17.969 10:59:42 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:17.969 00:05:17.969 real 0m24.959s 00:05:17.969 user 0m8.735s 00:05:17.969 sys 0m15.113s 
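The compare-and-continue churn above is setup/common.sh's get_meminfo helper scanning a meminfo file one key at a time until the requested field matches, which is why the xtrace shows one compare per key. A simplified sketch of that logic follows; the traced script actually slurps the file with mapfile and strips the per-node "Node N " prefix with an extglob pattern, and the sed call here is only a shorthand for that step:

  # Simplified sketch of the get_meminfo helper seen in the trace: print the
  # value of one meminfo key, optionally scoped to a NUMA node.
  get_meminfo() {
    local get=$1 node=${2:-} var val _
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix every line with "Node N "; strip it so each line
    # looks like plain /proc/meminfo ("Key:   value kB").
    while IFS=': ' read -r var val _; do
      [[ $var == "$get" ]] && echo "$val" && return 0
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
  }
  # Mirroring the trace: get_meminfo HugePages_Total  -> 1024
  #                      get_meminfo HugePages_Surp 0 -> 0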
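The clear_hp teardown traced just before the totals is likewise a plain nested loop over nodes and page sizes. A minimal sketch, assuming root and the standard sysfs layout:

  # Minimal sketch of the clear_hp teardown: release every hugepage
  # reservation on every NUMA node (writing sysfs requires root).
  for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
      echo 0 > "$hp/nr_hugepages"   # drop reservations for this page size
    done
  done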
00:05:17.969 10:59:42 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:17.969 10:59:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:17.969 ************************************ 00:05:17.969 END TEST hugepages 00:05:17.969 ************************************ 00:05:17.969 10:59:42 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:17.969 10:59:42 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:17.969 10:59:42 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:17.969 10:59:42 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:18.230 ************************************ 00:05:18.230 START TEST driver 00:05:18.230 ************************************ 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:18.230 * Looking for test storage... 00:05:18.230 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:18.230 10:59:42 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:18.230 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.230 --rc genhtml_branch_coverage=1 00:05:18.230 --rc genhtml_function_coverage=1 00:05:18.230 --rc genhtml_legend=1 00:05:18.230 --rc geninfo_all_blocks=1 00:05:18.230 --rc geninfo_unexecuted_blocks=1 00:05:18.230 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.230 ' 00:05:18.230 10:59:42 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:18.230 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.230 --rc genhtml_branch_coverage=1 00:05:18.230 --rc genhtml_function_coverage=1 00:05:18.230 --rc genhtml_legend=1 00:05:18.230 --rc geninfo_all_blocks=1 00:05:18.231 --rc geninfo_unexecuted_blocks=1 00:05:18.231 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.231 ' 00:05:18.231 10:59:42 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:18.231 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.231 --rc genhtml_branch_coverage=1 00:05:18.231 --rc genhtml_function_coverage=1 00:05:18.231 --rc genhtml_legend=1 00:05:18.231 --rc geninfo_all_blocks=1 00:05:18.231 --rc geninfo_unexecuted_blocks=1 00:05:18.231 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.231 ' 00:05:18.231 10:59:42 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:18.231 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.231 --rc genhtml_branch_coverage=1 00:05:18.231 --rc genhtml_function_coverage=1 00:05:18.231 --rc genhtml_legend=1 00:05:18.231 --rc geninfo_all_blocks=1 00:05:18.231 --rc geninfo_unexecuted_blocks=1 00:05:18.231 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.231 ' 00:05:18.231 10:59:42 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:18.231 10:59:42 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:18.231 10:59:42 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:23.516 10:59:47 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:23.516 10:59:47 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:23.516 10:59:47 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:23.516 10:59:47 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:23.516 ************************************ 00:05:23.516 START TEST guess_driver 00:05:23.516 ************************************ 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:23.516 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:23.516 Looking for driver=vfio-pci 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:23.516 10:59:47 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 10:59:47 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 10:59:47 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:26.814 10:59:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 10:59:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 10:59:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[... the same @58 / @61 / @57 marker-check triad elided for each remaining line of the config output, every one confirming vfio-pci ...]
00:05:28.988 10:59:53 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 10:59:53 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 10:59:53 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 10:59:53 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 10:59:53 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 10:59:53 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 10:59:53 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:34.274 00:05:34.274 real 0m10.361s 00:05:34.274 user 0m2.712s 00:05:34.274 sys 0m5.282s 00:05:34.274 10:59:58 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:34.274 10:59:58 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:34.274 ************************************ 00:05:34.274 END TEST guess_driver 00:05:34.274 ************************************ 00:05:34.274 00:05:34.274 real 0m15.665s 00:05:34.274 user 0m4.109s 00:05:34.274 sys 0m8.340s 00:05:34.274 10:59:58 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:34.274 10:59:58 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:34.274 ************************************ 00:05:34.274 END TEST driver 00:05:34.274 ************************************ 00:05:34.274 10:59:58 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 10:59:58 setup.sh -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:34.274 10:59:58 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:34.274 10:59:58 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:34.274 ************************************ 00:05:34.274 START TEST devices 00:05:34.274 ************************************ 00:05:34.274 10:59:58 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:34.274 * Looking for test storage... 00:05:34.274 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:34.274 10:59:58 setup.sh.devices -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1693 -- # lcov --version 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:34.275 10:59:58 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:34.275 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.275 --rc genhtml_branch_coverage=1 00:05:34.275 --rc genhtml_function_coverage=1 00:05:34.275 --rc genhtml_legend=1 00:05:34.275 --rc geninfo_all_blocks=1 00:05:34.275 --rc geninfo_unexecuted_blocks=1 00:05:34.275 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.275 ' 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:34.275 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.275 --rc genhtml_branch_coverage=1 00:05:34.275 --rc genhtml_function_coverage=1 00:05:34.275 --rc genhtml_legend=1 00:05:34.275 --rc geninfo_all_blocks=1 00:05:34.275 --rc geninfo_unexecuted_blocks=1 00:05:34.275 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.275 ' 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:34.275 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.275 --rc genhtml_branch_coverage=1 00:05:34.275 --rc genhtml_function_coverage=1 00:05:34.275 --rc genhtml_legend=1 00:05:34.275 --rc geninfo_all_blocks=1 00:05:34.275 --rc geninfo_unexecuted_blocks=1 00:05:34.275 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.275 ' 00:05:34.275 10:59:58 setup.sh.devices -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:34.275 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.275 --rc genhtml_branch_coverage=1 00:05:34.275 --rc genhtml_function_coverage=1 00:05:34.275 --rc genhtml_legend=1 00:05:34.275 --rc geninfo_all_blocks=1 00:05:34.275 --rc geninfo_unexecuted_blocks=1 00:05:34.275 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.275 ' 00:05:34.275 10:59:58 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:34.275 10:59:58 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:34.275 10:59:58 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:34.275 10:59:58 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:38.481 11:00:02 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:38.481 No valid GPT data, bailing 00:05:38.481 11:00:02 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:38.481 11:00:02 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:38.481 11:00:02 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:38.481 11:00:02 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:38.481 11:00:02 
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:38.481 11:00:02 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:38.481 ************************************ 00:05:38.481 START TEST nvme_mount 00:05:38.481 ************************************ 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:38.481 11:00:02 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:39.053 Creating new GPT entries in memory. 00:05:39.053 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:39.053 other utilities. 00:05:39.053 11:00:03 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:39.053 11:00:03 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:39.053 11:00:03 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:39.053 11:00:03 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:39.053 11:00:03 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:40.437 Creating new GPT entries in memory. 00:05:40.437 The operation has completed successfully. 
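The sgdisk steps above are straightforward to replay outside the harness. A rough sketch, assuming /dev/nvme0n1 is a disposable test disk; the harness itself serializes on udev block/partition events via sync_dev_uevents.sh, for which partprobe below is only an approximate stand-in:

  # Rough replay of the traced partition_drive steps. WARNING: destroys all
  # data on $disk; /dev/nvme0n1 is an assumption for illustration.
  disk=/dev/nvme0n1
  sgdisk "$disk" --zap-all                           # wipe old GPT/MBR structures
  flock "$disk" sgdisk "$disk" --new=1:2048:2099199  # sectors 2048-2099199 = 1 GiB partition
  partprobe "$disk"                                  # re-read the partition table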
00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 117505 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:40.437 11:00:04 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.744 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.745 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:43.745 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:43.745 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:43.745 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:43.745 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:44.005 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:44.005 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:44.005 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:44.005 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:44.005 11:00:08 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.303 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:47.304 11:00:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:47.565 11:00:12 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.863 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:50.864 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:51.125 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:51.125 00:05:51.125 real 0m13.060s 00:05:51.125 user 0m3.797s 00:05:51.125 sys 0m7.206s 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:51.125 11:00:15 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:51.125 ************************************ 00:05:51.125 END TEST nvme_mount 00:05:51.125 ************************************ 00:05:51.125 11:00:15 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:51.125 11:00:15 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:51.125 11:00:15 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:51.125 11:00:15 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:51.125 ************************************ 00:05:51.125 START TEST dm_mount 00:05:51.125 ************************************ 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:51.125 11:00:15 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:52.509 Creating new GPT entries in memory. 00:05:52.509 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:52.509 other utilities. 00:05:52.509 11:00:16 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:52.509 11:00:16 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:52.509 11:00:16 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:52.509 11:00:16 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:52.509 11:00:16 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:53.449 Creating new GPT entries in memory. 00:05:53.449 The operation has completed successfully. 00:05:53.449 11:00:17 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:53.449 11:00:17 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:53.449 11:00:17 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:53.449 11:00:17 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:53.449 11:00:17 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:54.390 The operation has completed successfully. 
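The dm_mount test has now carved two adjacent 1 GiB partitions; the next step in the trace builds a device-mapper node from them. The log shows the dmsetup create call but not the table fed to it; since both partitions later appear as holders of dm-0, a linear concatenation is the likely shape. A sketch under that assumption:

    p1=/dev/nvme0n1p1 p2=/dev/nvme0n1p2
    sz1=$(blockdev --getsz "$p1")          # partition sizes in 512-byte sectors
    sz2=$(blockdev --getsz "$p2")
    {
        echo "0 $sz1 linear $p1 0"         # map p1 at the start of the dm device
        echo "$sz1 $sz2 linear $p2 0"      # append p2 immediately after it
    } | dmsetup create nvme_dm_test
    readlink -f /dev/mapper/nvme_dm_test   # resolves to /dev/dm-0 in this run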
00:05:54.390 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:54.390 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:54.390 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 122547 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:54.391 11:00:18 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:54.391 11:00:19 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:57.690 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:57.951 11:00:22 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:57.951 11:00:22 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:01.249 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:01.510 11:00:25 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:01.510 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:01.510 11:00:26 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:01.510 11:00:26 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:01.510 00:06:01.510 real 0m10.276s 00:06:01.510 user 0m2.493s 00:06:01.510 sys 0m4.877s 00:06:01.510 11:00:26 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:01.510 11:00:26 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:01.510 ************************************ 00:06:01.510 END TEST dm_mount 00:06:01.510 ************************************ 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:01.510 11:00:26 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:01.771 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:01.771 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:01.771 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:01.771 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:01.771 11:00:26 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:01.771 11:00:26 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:01.771 11:00:26 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:01.771 11:00:26 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:01.771 11:00:26 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:01.771 11:00:26 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:01.771 11:00:26 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:01.771 00:06:01.771 real 0m27.980s 00:06:01.771 user 0m7.867s 00:06:01.771 sys 0m15.083s 00:06:01.771 11:00:26 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:01.771 11:00:26 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:01.771 ************************************ 00:06:01.771 END TEST devices 00:06:01.771 ************************************ 00:06:01.771 00:06:01.771 real 1m34.629s 00:06:01.771 user 0m29.108s 00:06:01.771 sys 0m54.479s 00:06:01.771 11:00:26 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:01.771 11:00:26 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:01.771 ************************************ 00:06:01.771 END TEST setup.sh 00:06:01.771 ************************************ 00:06:02.031 11:00:26 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:05.329 Hugepages 00:06:05.329 node hugesize free / total 00:06:05.329 node0 1048576kB 0 / 0 00:06:05.329 node0 2048kB 1024 / 1024 00:06:05.329 node1 1048576kB 0 / 0 00:06:05.329 node1 2048kB 1024 / 1024 00:06:05.329 00:06:05.329 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:05.329 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:05.329 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:05.589 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:05.589 11:00:30 -- spdk/autotest.sh@117 -- # uname -s 00:06:05.589 11:00:30 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:05.589 11:00:30 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:05.589 11:00:30 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:08.890 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:08.890 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:08.890 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:09.150 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:09.150 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:09.150 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:09.150 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:09.150 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:10.538 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:10.799 11:00:35 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:11.743 11:00:36 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:11.743 11:00:36 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:11.743 11:00:36 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:11.743 11:00:36 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:11.743 11:00:36 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:11.743 11:00:36 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:11.743 11:00:36 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:11.743 11:00:36 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:11.743 11:00:36 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:11.743 11:00:36 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:11.743 11:00:36 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:11.743 11:00:36 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:15.051 Waiting for block devices as requested 00:06:15.311 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:15.311 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:15.311 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:15.573 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:15.573 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:15.573 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:15.833 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:15.833 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:15.833 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:16.093 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:16.093 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:16.093 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:16.354 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:16.354 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:16.354 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:16.614 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:16.614 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:16.874 11:00:41 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:06:16.874 11:00:41 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:06:16.874 11:00:41 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:16.874 11:00:41 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:06:16.874 11:00:41 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1531 -- # grep oacs 00:06:16.874 11:00:41 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:06:16.874 11:00:41 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:06:16.874 11:00:41 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:06:16.874 11:00:41 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:06:16.874 11:00:41 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:06:16.874 11:00:41 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:06:16.874 11:00:41 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:06:16.874 11:00:41 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:06:16.874 11:00:41 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:06:16.874 11:00:41 -- common/autotest_common.sh@1543 -- # continue 00:06:16.874 11:00:41 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:16.874 11:00:41 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:16.874 11:00:41 -- common/autotest_common.sh@10 -- # set +x 00:06:16.874 11:00:41 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:16.874 11:00:41 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:16.874 11:00:41 -- common/autotest_common.sh@10 -- # set +x 00:06:16.874 11:00:41 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:20.174 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:20.174 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:20.434 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:22.347 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:22.347 11:00:46 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:22.347 11:00:46 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:22.347 11:00:46 -- common/autotest_common.sh@10 -- # set +x 00:06:22.347 11:00:46 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:22.347 11:00:46 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:06:22.347 11:00:46 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:06:22.347 11:00:46 -- common/autotest_common.sh@1563 -- # bdfs=() 00:06:22.347 11:00:46 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:06:22.347 11:00:46 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:06:22.347 11:00:46 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:06:22.347 11:00:46 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:06:22.347 11:00:46 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:22.347 11:00:46 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:22.347 11:00:46 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:22.347 11:00:46 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:22.347 11:00:46 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:22.347 11:00:46 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:22.347 11:00:46 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:22.348 11:00:46 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:06:22.348 11:00:46 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:22.348 11:00:46 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:06:22.348 11:00:46 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:22.348 11:00:46 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:06:22.348 11:00:46 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:06:22.348 11:00:46 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:d8:00.0 00:06:22.348 11:00:46 -- common/autotest_common.sh@1579 -- # [[ -z 0000:d8:00.0 ]] 00:06:22.348 11:00:46 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=132423 00:06:22.348 11:00:46 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:22.348 11:00:46 -- common/autotest_common.sh@1585 -- # waitforlisten 132423 00:06:22.348 11:00:46 -- common/autotest_common.sh@835 -- # '[' -z 132423 ']' 00:06:22.348 11:00:46 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:22.348 11:00:46 -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:22.348 11:00:46 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:22.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:22.348 11:00:46 -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:22.348 11:00:46 -- common/autotest_common.sh@10 -- # set +x 00:06:22.348 [2024-11-17 11:00:46.834654] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
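After launching spdk_tgt in the background, the harness's waitforlisten helper blocks until the target answers on its JSON-RPC socket. A simplified polling loop with the same effect (rpc_get_methods is a standard SPDK RPC; the loop itself is an illustration, not the helper's exact code):

    sock=/var/tmp/spdk.sock
    for _ in $(seq 1 100); do                              # max_retries=100, as logged
        if scripts/rpc.py -s "$sock" rpc_get_methods &>/dev/null; then
            break                                          # target is up and serving RPCs
        fi
        kill -0 132423 2>/dev/null || exit 1               # bail out if spdk_tgt died early
        sleep 0.5
    done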
00:06:22.348 [2024-11-17 11:00:46.834700] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid132423 ] 00:06:22.348 [2024-11-17 11:00:46.916837] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.348 [2024-11-17 11:00:46.939514] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.608 11:00:47 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:22.608 11:00:47 -- common/autotest_common.sh@868 -- # return 0 00:06:22.608 11:00:47 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:06:22.608 11:00:47 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:06:22.608 11:00:47 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:25.906 nvme0n1 00:06:25.906 11:00:50 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:25.906 [2024-11-17 11:00:50.348625] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:25.906 request: 00:06:25.906 { 00:06:25.906 "nvme_ctrlr_name": "nvme0", 00:06:25.906 "password": "test", 00:06:25.906 "method": "bdev_nvme_opal_revert", 00:06:25.906 "req_id": 1 00:06:25.906 } 00:06:25.906 Got JSON-RPC error response 00:06:25.906 response: 00:06:25.906 { 00:06:25.906 "code": -32602, 00:06:25.906 "message": "Invalid parameters" 00:06:25.906 } 00:06:25.906 11:00:50 -- common/autotest_common.sh@1591 -- # true 00:06:25.906 11:00:50 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:06:25.906 11:00:50 -- common/autotest_common.sh@1595 -- # killprocess 132423 00:06:25.906 11:00:50 -- common/autotest_common.sh@954 -- # '[' -z 132423 ']' 00:06:25.906 11:00:50 -- common/autotest_common.sh@958 -- # kill -0 132423 00:06:25.906 11:00:50 -- common/autotest_common.sh@959 -- # uname 00:06:25.906 11:00:50 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:25.906 11:00:50 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 132423 00:06:25.906 11:00:50 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:25.906 11:00:50 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:25.906 11:00:50 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 132423' 00:06:25.906 killing process with pid 132423 00:06:25.906 11:00:50 -- common/autotest_common.sh@973 -- # kill 132423 00:06:25.906 11:00:50 -- common/autotest_common.sh@978 -- # wait 132423 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 
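The -32602 response above is expected on this hardware: the controller reports no Opal support, so bdev_nvme_opal_revert fails and the harness deliberately swallows the error (the bare true in the trace). Reduced to its essentials, the revert attempt and teardown look like this (RPC names, PCI address, and PID are taken from the log):

    scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test || true   # tolerate non-Opal drives
    kill 132423 && wait 132423                                      # then stop the target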
00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 
0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead 
of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.906 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: 
Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:25.907 EAL: Unexpected size 0 of DMA remapping 
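For each surviving BDF the test attaches a controller over PCIe and then tries an Opal revert; this drive answers "not support opal", so the JSON-RPC error above is expected and swallowed with a trailing true before the loop advances and spdk_tgt is torn down. The same sequence as plain commands, assuming the rpc.py path from this workspace:

    # Attach the controller, then attempt the revert; non-Opal drives fail
    # the revert legitimately, so tolerate the JSON-RPC error.
    "$rootdir/scripts/rpc.py" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    "$rootdir/scripts/rpc.py" bdev_nvme_opal_revert -b nvme0 -p test || true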
00:06:28.450 11:00:52 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:28.450 11:00:52 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:28.450 11:00:52 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:28.450 11:00:52 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:28.450 11:00:52 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:28.450 11:00:52 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:28.450 11:00:52 -- common/autotest_common.sh@10 -- # set +x 00:06:28.450 11:00:52 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:28.450 11:00:52 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:28.450 11:00:52 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:28.450 11:00:52 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:28.450 11:00:52 -- common/autotest_common.sh@10 -- # set +x 00:06:28.451 ************************************ 00:06:28.451 START TEST env 00:06:28.451 ************************************ 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh * Looking for test storage... 00:06:28.451 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1693 -- # lcov --version 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:28.451 11:00:52 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:28.451 11:00:52 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:28.451 11:00:52 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:28.451 11:00:52 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:28.451 11:00:52 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:28.451 11:00:52 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:28.451 11:00:52 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:28.451 11:00:52 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:28.451 11:00:52 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:28.451 11:00:52 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:28.451 11:00:52 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:28.451 11:00:52 env -- scripts/common.sh@344 -- # case "$op" in 00:06:28.451 11:00:52 env -- scripts/common.sh@345 -- # : 1 00:06:28.451 11:00:52 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:28.451 11:00:52 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:28.451 11:00:52 env -- scripts/common.sh@365 -- # decimal 1 00:06:28.451 11:00:52 env -- scripts/common.sh@353 -- # local d=1 00:06:28.451 11:00:52 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:28.451 11:00:52 env -- scripts/common.sh@355 -- # echo 1 00:06:28.451 11:00:52 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:28.451 11:00:52 env -- scripts/common.sh@366 -- # decimal 2 00:06:28.451 11:00:52 env -- scripts/common.sh@353 -- # local d=2 00:06:28.451 11:00:52 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:28.451 11:00:52 env -- scripts/common.sh@355 -- # echo 2 00:06:28.451 11:00:52 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:28.451 11:00:52 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:28.451 11:00:52 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:28.451 11:00:52 env -- scripts/common.sh@368 -- # return 0 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:28.451 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:28.451 --rc genhtml_branch_coverage=1 00:06:28.451 --rc genhtml_function_coverage=1 00:06:28.451 --rc genhtml_legend=1 00:06:28.451 --rc geninfo_all_blocks=1 00:06:28.451 --rc geninfo_unexecuted_blocks=1 00:06:28.451 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:28.451 ' 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:28.451 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:28.451 --rc genhtml_branch_coverage=1 00:06:28.451 --rc genhtml_function_coverage=1 00:06:28.451 --rc genhtml_legend=1 00:06:28.451 --rc geninfo_all_blocks=1 00:06:28.451 --rc geninfo_unexecuted_blocks=1 00:06:28.451 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:28.451 ' 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:28.451 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:28.451 --rc genhtml_branch_coverage=1 00:06:28.451 --rc genhtml_function_coverage=1 00:06:28.451 --rc genhtml_legend=1 00:06:28.451 --rc geninfo_all_blocks=1 00:06:28.451 --rc geninfo_unexecuted_blocks=1 00:06:28.451 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:28.451 ' 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:28.451 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:28.451 --rc genhtml_branch_coverage=1 00:06:28.451 --rc genhtml_function_coverage=1 00:06:28.451 --rc genhtml_legend=1 00:06:28.451 --rc geninfo_all_blocks=1 00:06:28.451 --rc geninfo_unexecuted_blocks=1 00:06:28.451 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:28.451 ' 00:06:28.451 11:00:52 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:28.451 11:00:52 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:28.451 11:00:52 env -- common/autotest_common.sh@10 -- # set +x 00:06:28.451 ************************************ 00:06:28.451 START TEST env_memory 00:06:28.451 ************************************ 00:06:28.451 11:00:52 env.env_memory -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:28.451 00:06:28.451 00:06:28.451 CUnit - A unit testing framework for C - Version 2.1-3 00:06:28.451 http://cunit.sourceforge.net/ 00:06:28.451 00:06:28.451 00:06:28.451 Suite: memory 00:06:28.451 Test: alloc and free memory map ...[2024-11-17 11:00:52.929312] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:28.451 passed 00:06:28.451 Test: mem map translation ...[2024-11-17 11:00:52.942331] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:28.451 [2024-11-17 11:00:52.942351] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:28.451 [2024-11-17 11:00:52.942381] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:28.451 [2024-11-17 11:00:52.942393] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:28.451 passed 00:06:28.451 Test: mem map registration ...[2024-11-17 11:00:52.962768] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:28.451 [2024-11-17 11:00:52.962784] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:28.451 passed 00:06:28.451 Test: mem map adjacent registrations ...passed 00:06:28.451 00:06:28.451 Run Summary: Type Total Ran Passed Failed Inactive 00:06:28.451 suites 1 1 n/a 0 0 00:06:28.451 tests 4 4 4 0 0 00:06:28.451 asserts 152 152 152 0 n/a 00:06:28.451 00:06:28.451 Elapsed time = 0.073 seconds 00:06:28.451 00:06:28.451 real 0m0.083s 00:06:28.451 user 0m0.071s 00:06:28.451 sys 0m0.011s 00:06:28.451 11:00:52 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:28.451 11:00:52 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:28.451 ************************************ 00:06:28.451 END TEST env_memory 00:06:28.451 ************************************ 00:06:28.451 11:00:53 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:28.451 11:00:53 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:28.451 11:00:53 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:28.451 11:00:53 env -- common/autotest_common.sh@10 -- # set +x 00:06:28.451 ************************************ 00:06:28.451 START TEST env_vtophys 00:06:28.451 ************************************ 00:06:28.451 11:00:53 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:28.451 EAL: lib.eal log level changed from notice to debug 00:06:28.451 EAL: Detected lcore 0 as core 0 on socket 0 00:06:28.451 EAL: Detected lcore 1 as core 1 on socket 0 00:06:28.451 EAL: Detected lcore 2 as core 2 on socket 0 00:06:28.451 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:28.451 EAL: Detected lcore 4 as core 4 on socket 0 00:06:28.451 EAL: Detected lcore 5 as core 5 on socket 0 00:06:28.451 EAL: Detected lcore 6 as core 6 on socket 0 00:06:28.451 EAL: Detected lcore 7 as core 8 on socket 0 00:06:28.451 EAL: Detected lcore 8 as core 9 on socket 0 00:06:28.451 EAL: Detected lcore 9 as core 10 on socket 0 00:06:28.451 EAL: Detected lcore 10 as core 11 on socket 0 00:06:28.451 EAL: Detected lcore 11 as core 12 on socket 0 00:06:28.451 EAL: Detected lcore 12 as core 13 on socket 0 00:06:28.451 EAL: Detected lcore 13 as core 14 on socket 0 00:06:28.451 EAL: Detected lcore 14 as core 16 on socket 0 00:06:28.451 EAL: Detected lcore 15 as core 17 on socket 0 00:06:28.451 EAL: Detected lcore 16 as core 18 on socket 0 00:06:28.451 EAL: Detected lcore 17 as core 19 on socket 0 00:06:28.451 EAL: Detected lcore 18 as core 20 on socket 0 00:06:28.451 EAL: Detected lcore 19 as core 21 on socket 0 00:06:28.451 EAL: Detected lcore 20 as core 22 on socket 0 00:06:28.451 EAL: Detected lcore 21 as core 24 on socket 0 00:06:28.451 EAL: Detected lcore 22 as core 25 on socket 0 00:06:28.451 EAL: Detected lcore 23 as core 26 on socket 0 00:06:28.451 EAL: Detected lcore 24 as core 27 on socket 0 00:06:28.451 EAL: Detected lcore 25 as core 28 on socket 0 00:06:28.451 EAL: Detected lcore 26 as core 29 on socket 0 00:06:28.451 EAL: Detected lcore 27 as core 30 on socket 0 00:06:28.451 EAL: Detected lcore 28 as core 0 on socket 1 00:06:28.451 EAL: Detected lcore 29 as core 1 on socket 1 00:06:28.451 EAL: Detected lcore 30 as core 2 on socket 1 00:06:28.451 EAL: Detected lcore 31 as core 3 on socket 1 00:06:28.451 EAL: Detected lcore 32 as core 4 on socket 1 00:06:28.451 EAL: Detected lcore 33 as core 5 on socket 1 00:06:28.451 EAL: Detected lcore 34 as core 6 on socket 1 00:06:28.451 EAL: Detected lcore 35 as core 8 on socket 1 00:06:28.451 EAL: Detected lcore 36 as core 9 on socket 1 00:06:28.451 EAL: Detected lcore 37 as core 10 on socket 1 00:06:28.451 EAL: Detected lcore 38 as core 11 on socket 1 00:06:28.451 EAL: Detected lcore 39 as core 12 on socket 1 00:06:28.452 EAL: Detected lcore 40 as core 13 on socket 1 00:06:28.452 EAL: Detected lcore 41 as core 14 on socket 1 00:06:28.452 EAL: Detected lcore 42 as core 16 on socket 1 00:06:28.452 EAL: Detected lcore 43 as core 17 on socket 1 00:06:28.452 EAL: Detected lcore 44 as core 18 on socket 1 00:06:28.452 EAL: Detected lcore 45 as core 19 on socket 1 00:06:28.452 EAL: Detected lcore 46 as core 20 on socket 1 00:06:28.452 EAL: Detected lcore 47 as core 21 on socket 1 00:06:28.452 EAL: Detected lcore 48 as core 22 on socket 1 00:06:28.452 EAL: Detected lcore 49 as core 24 on socket 1 00:06:28.452 EAL: Detected lcore 50 as core 25 on socket 1 00:06:28.452 EAL: Detected lcore 51 as core 26 on socket 1 00:06:28.452 EAL: Detected lcore 52 as core 27 on socket 1 00:06:28.452 EAL: Detected lcore 53 as core 28 on socket 1 00:06:28.452 EAL: Detected lcore 54 as core 29 on socket 1 00:06:28.452 EAL: Detected lcore 55 as core 30 on socket 1 00:06:28.452 EAL: Detected lcore 56 as core 0 on socket 0 00:06:28.452 EAL: Detected lcore 57 as core 1 on socket 0 00:06:28.452 EAL: Detected lcore 58 as core 2 on socket 0 00:06:28.452 EAL: Detected lcore 59 as core 3 on socket 0 00:06:28.452 EAL: Detected lcore 60 as core 4 on socket 0 00:06:28.452 EAL: Detected lcore 61 as core 5 on socket 0 00:06:28.452 EAL: Detected lcore 62 as core 6 on socket 0 00:06:28.452 EAL: Detected lcore 63 as core 8 on socket 0 00:06:28.452 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:28.452 EAL: Detected lcore 65 as core 10 on socket 0 00:06:28.452 EAL: Detected lcore 66 as core 11 on socket 0 00:06:28.452 EAL: Detected lcore 67 as core 12 on socket 0 00:06:28.452 EAL: Detected lcore 68 as core 13 on socket 0 00:06:28.452 EAL: Detected lcore 69 as core 14 on socket 0 00:06:28.452 EAL: Detected lcore 70 as core 16 on socket 0 00:06:28.452 EAL: Detected lcore 71 as core 17 on socket 0 00:06:28.452 EAL: Detected lcore 72 as core 18 on socket 0 00:06:28.452 EAL: Detected lcore 73 as core 19 on socket 0 00:06:28.452 EAL: Detected lcore 74 as core 20 on socket 0 00:06:28.452 EAL: Detected lcore 75 as core 21 on socket 0 00:06:28.452 EAL: Detected lcore 76 as core 22 on socket 0 00:06:28.452 EAL: Detected lcore 77 as core 24 on socket 0 00:06:28.452 EAL: Detected lcore 78 as core 25 on socket 0 00:06:28.452 EAL: Detected lcore 79 as core 26 on socket 0 00:06:28.452 EAL: Detected lcore 80 as core 27 on socket 0 00:06:28.452 EAL: Detected lcore 81 as core 28 on socket 0 00:06:28.452 EAL: Detected lcore 82 as core 29 on socket 0 00:06:28.452 EAL: Detected lcore 83 as core 30 on socket 0 00:06:28.452 EAL: Detected lcore 84 as core 0 on socket 1 00:06:28.452 EAL: Detected lcore 85 as core 1 on socket 1 00:06:28.452 EAL: Detected lcore 86 as core 2 on socket 1 00:06:28.452 EAL: Detected lcore 87 as core 3 on socket 1 00:06:28.452 EAL: Detected lcore 88 as core 4 on socket 1 00:06:28.452 EAL: Detected lcore 89 as core 5 on socket 1 00:06:28.452 EAL: Detected lcore 90 as core 6 on socket 1 00:06:28.452 EAL: Detected lcore 91 as core 8 on socket 1 00:06:28.452 EAL: Detected lcore 92 as core 9 on socket 1 00:06:28.452 EAL: Detected lcore 93 as core 10 on socket 1 00:06:28.452 EAL: Detected lcore 94 as core 11 on socket 1 00:06:28.452 EAL: Detected lcore 95 as core 12 on socket 1 00:06:28.452 EAL: Detected lcore 96 as core 13 on socket 1 00:06:28.452 EAL: Detected lcore 97 as core 14 on socket 1 00:06:28.452 EAL: Detected lcore 98 as core 16 on socket 1 00:06:28.452 EAL: Detected lcore 99 as core 17 on socket 1 00:06:28.452 EAL: Detected lcore 100 as core 18 on socket 1 00:06:28.452 EAL: Detected lcore 101 as core 19 on socket 1 00:06:28.452 EAL: Detected lcore 102 as core 20 on socket 1 00:06:28.452 EAL: Detected lcore 103 as core 21 on socket 1 00:06:28.452 EAL: Detected lcore 104 as core 22 on socket 1 00:06:28.452 EAL: Detected lcore 105 as core 24 on socket 1 00:06:28.452 EAL: Detected lcore 106 as core 25 on socket 1 00:06:28.452 EAL: Detected lcore 107 as core 26 on socket 1 00:06:28.452 EAL: Detected lcore 108 as core 27 on socket 1 00:06:28.452 EAL: Detected lcore 109 as core 28 on socket 1 00:06:28.452 EAL: Detected lcore 110 as core 29 on socket 1 00:06:28.452 EAL: Detected lcore 111 as core 30 on socket 1 00:06:28.452 EAL: Maximum logical cores by configuration: 128 00:06:28.452 EAL: Detected CPU lcores: 112 00:06:28.452 EAL: Detected NUMA nodes: 2 00:06:28.452 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:06:28.452 EAL: Checking presence of .so 'librte_eal.so.23' 00:06:28.452 EAL: Checking presence of .so 'librte_eal.so' 00:06:28.452 EAL: Detected static linkage of DPDK 00:06:28.452 EAL: No shared files mode enabled, IPC will be disabled 00:06:28.713 EAL: Bus pci wants IOVA as 'DC' 00:06:28.713 EAL: Buses did not request a specific IOVA mode. 00:06:28.713 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:28.713 EAL: Selected IOVA mode 'VA' 00:06:28.713 EAL: Probing VFIO support... 
00:06:28.713 EAL: IOMMU type 1 (Type 1) is supported 00:06:28.713 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:28.713 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:28.713 EAL: VFIO support initialized 00:06:28.713 EAL: Ask a virtual area of 0x2e000 bytes 00:06:28.713 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:28.713 EAL: Setting up physically contiguous memory... 00:06:28.713 EAL: Setting maximum number of open files to 524288 00:06:28.713 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:28.713 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:28.713 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:28.713 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:28.713 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.713 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:28.713 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.713 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.713 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:28.713 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:28.713 EAL: Hugepages will be freed exactly as allocated. 00:06:28.713 EAL: No shared files mode enabled, IPC is disabled 00:06:28.713 EAL: No shared files mode enabled, IPC is disabled 00:06:28.713 EAL: TSC frequency is ~2500000 KHz 00:06:28.713 EAL: Main lcore 0 is ready (tid=7fcadefbda00;cpuset=[0]) 00:06:28.713 EAL: Trying to obtain current memory policy. 00:06:28.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.713 EAL: Restoring previous memory policy: 0 00:06:28.713 EAL: request: mp_malloc_sync 00:06:28.713 EAL: No shared files mode enabled, IPC is disabled 00:06:28.713 EAL: Heap on socket 0 was expanded by 2MB 00:06:28.713 EAL: No shared files mode enabled, IPC is disabled 00:06:28.713 EAL: Mem event callback 'spdk:(nil)' registered 00:06:28.713 00:06:28.713 00:06:28.713 CUnit - A unit testing framework for C - Version 2.1-3 00:06:28.713 http://cunit.sourceforge.net/ 00:06:28.713 00:06:28.713 00:06:28.713 Suite: components_suite 00:06:28.713 Test: vtophys_malloc_test ...passed 00:06:28.713 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:28.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.713 EAL: Restoring previous memory policy: 4 00:06:28.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.713 EAL: request: mp_malloc_sync 00:06:28.713 EAL: No shared files mode enabled, IPC is disabled 00:06:28.713 EAL: Heap on socket 0 was expanded by 4MB 00:06:28.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.713 EAL: request: mp_malloc_sync 00:06:28.713 EAL: No shared files mode enabled, IPC is disabled 00:06:28.713 EAL: Heap on socket 0 was shrunk by 4MB 00:06:28.713 EAL: Trying to obtain current memory policy. 00:06:28.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.713 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 6MB 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was shrunk by 6MB 00:06:28.714 EAL: Trying to obtain current memory policy. 00:06:28.714 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.714 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 10MB 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was shrunk by 10MB 00:06:28.714 EAL: Trying to obtain current memory policy. 
00:06:28.714 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.714 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 18MB 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was shrunk by 18MB 00:06:28.714 EAL: Trying to obtain current memory policy. 00:06:28.714 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.714 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 34MB 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was shrunk by 34MB 00:06:28.714 EAL: Trying to obtain current memory policy. 00:06:28.714 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.714 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 66MB 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was shrunk by 66MB 00:06:28.714 EAL: Trying to obtain current memory policy. 00:06:28.714 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.714 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 130MB 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was shrunk by 130MB 00:06:28.714 EAL: Trying to obtain current memory policy. 00:06:28.714 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.714 EAL: Restoring previous memory policy: 4 00:06:28.714 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.714 EAL: request: mp_malloc_sync 00:06:28.714 EAL: No shared files mode enabled, IPC is disabled 00:06:28.714 EAL: Heap on socket 0 was expanded by 258MB 00:06:28.974 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.974 EAL: request: mp_malloc_sync 00:06:28.974 EAL: No shared files mode enabled, IPC is disabled 00:06:28.974 EAL: Heap on socket 0 was shrunk by 258MB 00:06:28.974 EAL: Trying to obtain current memory policy. 
00:06:28.974 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.974 EAL: Restoring previous memory policy: 4 00:06:28.974 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.974 EAL: request: mp_malloc_sync 00:06:28.974 EAL: No shared files mode enabled, IPC is disabled 00:06:28.974 EAL: Heap on socket 0 was expanded by 514MB 00:06:28.974 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.233 EAL: request: mp_malloc_sync 00:06:29.233 EAL: No shared files mode enabled, IPC is disabled 00:06:29.233 EAL: Heap on socket 0 was shrunk by 514MB 00:06:29.233 EAL: Trying to obtain current memory policy. 00:06:29.233 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.233 EAL: Restoring previous memory policy: 4 00:06:29.233 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.233 EAL: request: mp_malloc_sync 00:06:29.233 EAL: No shared files mode enabled, IPC is disabled 00:06:29.233 EAL: Heap on socket 0 was expanded by 1026MB 00:06:29.492 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.752 EAL: request: mp_malloc_sync 00:06:29.752 EAL: No shared files mode enabled, IPC is disabled 00:06:29.752 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:29.752 passed 00:06:29.752 00:06:29.752 Run Summary: Type Total Ran Passed Failed Inactive 00:06:29.752 suites 1 1 n/a 0 0 00:06:29.752 tests 2 2 2 0 0 00:06:29.752 asserts 497 497 497 0 n/a 00:06:29.752 00:06:29.752 Elapsed time = 0.973 seconds 00:06:29.752 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.752 EAL: request: mp_malloc_sync 00:06:29.752 EAL: No shared files mode enabled, IPC is disabled 00:06:29.752 EAL: Heap on socket 0 was shrunk by 2MB 00:06:29.752 EAL: No shared files mode enabled, IPC is disabled 00:06:29.752 EAL: No shared files mode enabled, IPC is disabled 00:06:29.752 EAL: No shared files mode enabled, IPC is disabled 00:06:29.752 00:06:29.752 real 0m1.114s 00:06:29.752 user 0m0.635s 00:06:29.752 sys 0m0.451s 00:06:29.752 11:00:54 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:29.752 11:00:54 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:29.752 ************************************ 00:06:29.752 END TEST env_vtophys 00:06:29.752 ************************************ 00:06:29.752 11:00:54 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:29.752 11:00:54 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:29.752 11:00:54 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:29.752 11:00:54 env -- common/autotest_common.sh@10 -- # set +x 00:06:29.752 ************************************ 00:06:29.752 START TEST env_pci 00:06:29.752 ************************************ 00:06:29.752 11:00:54 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:29.752 00:06:29.752 00:06:29.752 CUnit - A unit testing framework for C - Version 2.1-3 00:06:29.752 http://cunit.sourceforge.net/ 00:06:29.752 00:06:29.752 00:06:29.752 Suite: pci 00:06:29.752 Test: pci_hook ...[2024-11-17 11:00:54.289577] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 133719 has claimed it 00:06:29.752 EAL: Cannot find device (10000:00:01.0) 00:06:29.752 EAL: Failed to attach device on primary process 00:06:29.752 passed 00:06:29.752 00:06:29.752 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:29.752 suites 1 1 n/a 0 0 00:06:29.753 tests 1 1 1 0 0 00:06:29.753 asserts 25 25 25 0 n/a 00:06:29.753 00:06:29.753 Elapsed time = 0.035 seconds 00:06:29.753 00:06:29.753 real 0m0.054s 00:06:29.753 user 0m0.014s 00:06:29.753 sys 0m0.040s 00:06:29.753 11:00:54 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:29.753 11:00:54 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:29.753 ************************************ 00:06:29.753 END TEST env_pci 00:06:29.753 ************************************ 00:06:29.753 11:00:54 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:29.753 11:00:54 env -- env/env.sh@15 -- # uname 00:06:29.753 11:00:54 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:29.753 11:00:54 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:29.753 11:00:54 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:29.753 11:00:54 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:29.753 11:00:54 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:29.753 11:00:54 env -- common/autotest_common.sh@10 -- # set +x 00:06:30.013 ************************************ 00:06:30.013 START TEST env_dpdk_post_init 00:06:30.013 ************************************ 00:06:30.013 11:00:54 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:30.013 EAL: Detected CPU lcores: 112 00:06:30.013 EAL: Detected NUMA nodes: 2 00:06:30.013 EAL: Detected static linkage of DPDK 00:06:30.013 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:30.013 EAL: Selected IOVA mode 'VA' 00:06:30.013 EAL: VFIO support initialized 00:06:30.013 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:30.013 EAL: Using IOMMU type 1 (Type 1) 00:06:30.967 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:34.392 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:34.392 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:34.683 Starting DPDK initialization... 00:06:34.683 Starting SPDK post initialization... 00:06:34.683 SPDK NVMe probe 00:06:34.683 Attaching to 0000:d8:00.0 00:06:34.683 Attached to 0000:d8:00.0 00:06:34.683 Cleaning up... 
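env_dpdk_post_init above attaches to 0000:d8:00.0 through the spdk_nvme PCI driver and detaches cleanly. When a probe like this fails instead, one quick host-side check (a hypothetical aside, not part of the test) is to read which kernel driver currently owns the BDF, since SPDK expects vfio-pci or a uio variant rather than the in-kernel nvme driver:

    # Prints e.g. "vfio-pci" or "nvme"; an empty link means the device is unbound.
    basename "$(readlink /sys/bus/pci/devices/0000:d8:00.0/driver)"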
00:06:34.683 00:06:34.683 real 0m4.702s 00:06:34.683 user 0m3.480s 00:06:34.683 sys 0m0.468s 00:06:34.683 11:00:59 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:34.683 11:00:59 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:34.684 ************************************ 00:06:34.684 END TEST env_dpdk_post_init 00:06:34.684 ************************************ 00:06:34.684 11:00:59 env -- env/env.sh@26 -- # uname 00:06:34.684 11:00:59 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:34.684 11:00:59 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:34.684 11:00:59 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:34.684 11:00:59 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:34.684 11:00:59 env -- common/autotest_common.sh@10 -- # set +x 00:06:34.684 ************************************ 00:06:34.684 START TEST env_mem_callbacks 00:06:34.684 ************************************ 00:06:34.684 11:00:59 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:34.684 EAL: Detected CPU lcores: 112 00:06:34.684 EAL: Detected NUMA nodes: 2 00:06:34.684 EAL: Detected static linkage of DPDK 00:06:34.684 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:34.684 EAL: Selected IOVA mode 'VA' 00:06:34.684 EAL: VFIO support initialized 00:06:34.684 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:34.684 00:06:34.684 00:06:34.684 CUnit - A unit testing framework for C - Version 2.1-3 00:06:34.684 http://cunit.sourceforge.net/ 00:06:34.684 00:06:34.684 00:06:34.684 Suite: memory 00:06:34.684 Test: test ... 
00:06:34.684 register 0x200000200000 2097152 00:06:34.684 malloc 3145728 00:06:34.684 register 0x200000400000 4194304 00:06:34.684 buf 0x200000500000 len 3145728 PASSED 00:06:34.684 malloc 64 00:06:34.684 buf 0x2000004fff40 len 64 PASSED 00:06:34.684 malloc 4194304 00:06:34.684 register 0x200000800000 6291456 00:06:34.684 buf 0x200000a00000 len 4194304 PASSED 00:06:34.684 free 0x200000500000 3145728 00:06:34.684 free 0x2000004fff40 64 00:06:34.684 unregister 0x200000400000 4194304 PASSED 00:06:34.684 free 0x200000a00000 4194304 00:06:34.684 unregister 0x200000800000 6291456 PASSED 00:06:34.684 malloc 8388608 00:06:34.684 register 0x200000400000 10485760 00:06:34.684 buf 0x200000600000 len 8388608 PASSED 00:06:34.684 free 0x200000600000 8388608 00:06:34.684 unregister 0x200000400000 10485760 PASSED 00:06:34.684 passed 00:06:34.684 00:06:34.684 Run Summary: Type Total Ran Passed Failed Inactive 00:06:34.684 suites 1 1 n/a 0 0 00:06:34.684 tests 1 1 1 0 0 00:06:34.684 asserts 15 15 15 0 n/a 00:06:34.684 00:06:34.684 Elapsed time = 0.008 seconds 00:06:34.684 00:06:34.684 real 0m0.069s 00:06:34.684 user 0m0.017s 00:06:34.684 sys 0m0.051s 00:06:34.684 11:00:59 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:34.684 11:00:59 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:34.684 ************************************ 00:06:34.684 END TEST env_mem_callbacks 00:06:34.684 ************************************ 00:06:34.684 00:06:34.684 real 0m6.653s 00:06:34.684 user 0m4.472s 00:06:34.684 sys 0m1.445s 00:06:34.684 11:00:59 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:34.684 11:00:59 env -- common/autotest_common.sh@10 -- # set +x 00:06:34.684 ************************************ 00:06:34.684 END TEST env 00:06:34.684 ************************************ 00:06:34.977 11:00:59 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:34.977 11:00:59 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:34.977 11:00:59 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:34.977 11:00:59 -- common/autotest_common.sh@10 -- # set +x 00:06:34.977 ************************************ 00:06:34.977 START TEST rpc 00:06:34.977 ************************************ 00:06:34.977 11:00:59 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:34.977 * Looking for test storage... 
00:06:34.978 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:34.978 11:00:59 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:34.978 11:00:59 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:34.978 11:00:59 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:34.978 11:00:59 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:34.978 11:00:59 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:34.978 11:00:59 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:34.978 11:00:59 rpc -- scripts/common.sh@345 -- # : 1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:34.978 11:00:59 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:34.978 11:00:59 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@353 -- # local d=1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:34.978 11:00:59 rpc -- scripts/common.sh@355 -- # echo 1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:34.978 11:00:59 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@353 -- # local d=2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:34.978 11:00:59 rpc -- scripts/common.sh@355 -- # echo 2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:34.978 11:00:59 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:34.978 11:00:59 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:34.978 11:00:59 rpc -- scripts/common.sh@368 -- # return 0 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:34.978 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.978 --rc genhtml_branch_coverage=1 00:06:34.978 --rc genhtml_function_coverage=1 00:06:34.978 --rc genhtml_legend=1 00:06:34.978 --rc geninfo_all_blocks=1 00:06:34.978 --rc geninfo_unexecuted_blocks=1 00:06:34.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.978 ' 00:06:34.978 11:00:59 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:34.978 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.978 --rc genhtml_branch_coverage=1 00:06:34.978 --rc genhtml_function_coverage=1 00:06:34.978 --rc genhtml_legend=1 00:06:34.978 --rc geninfo_all_blocks=1 00:06:34.978 --rc geninfo_unexecuted_blocks=1 00:06:34.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.978 ' 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@1707 -- # 
export 'LCOV=lcov 00:06:35.266 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.266 --rc genhtml_branch_coverage=1 00:06:35.266 --rc genhtml_function_coverage=1 00:06:35.266 --rc genhtml_legend=1 00:06:35.266 --rc geninfo_all_blocks=1 00:06:35.266 --rc geninfo_unexecuted_blocks=1 00:06:35.266 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.266 ' 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:35.266 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.266 --rc genhtml_branch_coverage=1 00:06:35.266 --rc genhtml_function_coverage=1 00:06:35.266 --rc genhtml_legend=1 00:06:35.266 --rc geninfo_all_blocks=1 00:06:35.266 --rc geninfo_unexecuted_blocks=1 00:06:35.266 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.266 ' 00:06:35.266 11:00:59 rpc -- rpc/rpc.sh@65 -- # spdk_pid=134890 00:06:35.266 11:00:59 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:35.266 11:00:59 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:35.266 11:00:59 rpc -- rpc/rpc.sh@67 -- # waitforlisten 134890 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@835 -- # '[' -z 134890 ']' 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.266 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:35.266 11:00:59 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.266 [2024-11-17 11:00:59.647313] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:35.266 [2024-11-17 11:00:59.647372] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134890 ] 00:06:35.266 [2024-11-17 11:00:59.732218] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.266 [2024-11-17 11:00:59.754057] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:35.266 [2024-11-17 11:00:59.754096] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 134890' to capture a snapshot of events at runtime. 00:06:35.266 [2024-11-17 11:00:59.754105] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:35.266 [2024-11-17 11:00:59.754113] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:35.266 [2024-11-17 11:00:59.754120] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid134890 for offline analysis/debug. 
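The app_setup_trace notices above give the capture recipe for the bdev tracepoint group this spdk_tgt was started with (-e bdev). Spelled out as commands, with the spdk_trace binary path assumed relative to this build tree:

    # Snapshot events while the target is still running...
    ./build/bin/spdk_trace -s spdk_tgt -p 134890
    # ...or keep the shared-memory file for offline analysis/debug.
    cp /dev/shm/spdk_tgt_trace.pid134890 /tmp/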
00:06:35.266 [2024-11-17 11:00:59.754701] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.553 11:00:59 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:35.553 11:00:59 rpc -- common/autotest_common.sh@868 -- # return 0 00:06:35.553 11:00:59 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:35.553 11:00:59 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:35.553 11:00:59 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:35.553 11:00:59 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:35.553 11:00:59 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:35.553 11:00:59 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:35.554 11:00:59 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.554 ************************************ 00:06:35.554 START TEST rpc_integrity 00:06:35.554 ************************************ 00:06:35.554 11:00:59 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:35.554 11:00:59 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:35.554 11:00:59 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.554 11:00:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.554 11:00:59 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.554 11:00:59 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:35.554 11:00:59 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:35.554 { 00:06:35.554 "name": "Malloc0", 00:06:35.554 "aliases": [ 00:06:35.554 "0fed679d-e494-442d-9dbf-6ddf56756a87" 00:06:35.554 ], 00:06:35.554 "product_name": "Malloc disk", 00:06:35.554 "block_size": 512, 00:06:35.554 "num_blocks": 16384, 00:06:35.554 "uuid": "0fed679d-e494-442d-9dbf-6ddf56756a87", 00:06:35.554 "assigned_rate_limits": { 00:06:35.554 "rw_ios_per_sec": 0, 00:06:35.554 "rw_mbytes_per_sec": 0, 00:06:35.554 "r_mbytes_per_sec": 0, 00:06:35.554 "w_mbytes_per_sec": 
0 00:06:35.554 }, 00:06:35.554 "claimed": false, 00:06:35.554 "zoned": false, 00:06:35.554 "supported_io_types": { 00:06:35.554 "read": true, 00:06:35.554 "write": true, 00:06:35.554 "unmap": true, 00:06:35.554 "flush": true, 00:06:35.554 "reset": true, 00:06:35.554 "nvme_admin": false, 00:06:35.554 "nvme_io": false, 00:06:35.554 "nvme_io_md": false, 00:06:35.554 "write_zeroes": true, 00:06:35.554 "zcopy": true, 00:06:35.554 "get_zone_info": false, 00:06:35.554 "zone_management": false, 00:06:35.554 "zone_append": false, 00:06:35.554 "compare": false, 00:06:35.554 "compare_and_write": false, 00:06:35.554 "abort": true, 00:06:35.554 "seek_hole": false, 00:06:35.554 "seek_data": false, 00:06:35.554 "copy": true, 00:06:35.554 "nvme_iov_md": false 00:06:35.554 }, 00:06:35.554 "memory_domains": [ 00:06:35.554 { 00:06:35.554 "dma_device_id": "system", 00:06:35.554 "dma_device_type": 1 00:06:35.554 }, 00:06:35.554 { 00:06:35.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.554 "dma_device_type": 2 00:06:35.554 } 00:06:35.554 ], 00:06:35.554 "driver_specific": {} 00:06:35.554 } 00:06:35.554 ]' 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.554 [2024-11-17 11:01:00.128443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:35.554 [2024-11-17 11:01:00.128475] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:35.554 [2024-11-17 11:01:00.128494] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x596e8c0 00:06:35.554 [2024-11-17 11:01:00.128503] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:35.554 [2024-11-17 11:01:00.129444] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:35.554 [2024-11-17 11:01:00.129466] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:35.554 Passthru0 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.554 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:35.554 { 00:06:35.554 "name": "Malloc0", 00:06:35.554 "aliases": [ 00:06:35.554 "0fed679d-e494-442d-9dbf-6ddf56756a87" 00:06:35.554 ], 00:06:35.554 "product_name": "Malloc disk", 00:06:35.554 "block_size": 512, 00:06:35.554 "num_blocks": 16384, 00:06:35.554 "uuid": "0fed679d-e494-442d-9dbf-6ddf56756a87", 00:06:35.554 "assigned_rate_limits": { 00:06:35.554 "rw_ios_per_sec": 0, 00:06:35.554 "rw_mbytes_per_sec": 0, 00:06:35.554 "r_mbytes_per_sec": 0, 00:06:35.554 "w_mbytes_per_sec": 0 00:06:35.554 }, 00:06:35.554 "claimed": true, 00:06:35.554 "claim_type": "exclusive_write", 00:06:35.554 "zoned": false, 00:06:35.554 "supported_io_types": { 00:06:35.554 "read": true, 00:06:35.554 "write": true, 00:06:35.554 "unmap": true, 
00:06:35.554 "flush": true, 00:06:35.554 "reset": true, 00:06:35.554 "nvme_admin": false, 00:06:35.554 "nvme_io": false, 00:06:35.554 "nvme_io_md": false, 00:06:35.554 "write_zeroes": true, 00:06:35.554 "zcopy": true, 00:06:35.554 "get_zone_info": false, 00:06:35.554 "zone_management": false, 00:06:35.554 "zone_append": false, 00:06:35.554 "compare": false, 00:06:35.554 "compare_and_write": false, 00:06:35.554 "abort": true, 00:06:35.554 "seek_hole": false, 00:06:35.554 "seek_data": false, 00:06:35.554 "copy": true, 00:06:35.554 "nvme_iov_md": false 00:06:35.554 }, 00:06:35.554 "memory_domains": [ 00:06:35.554 { 00:06:35.554 "dma_device_id": "system", 00:06:35.554 "dma_device_type": 1 00:06:35.554 }, 00:06:35.554 { 00:06:35.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.554 "dma_device_type": 2 00:06:35.554 } 00:06:35.554 ], 00:06:35.554 "driver_specific": {} 00:06:35.554 }, 00:06:35.554 { 00:06:35.554 "name": "Passthru0", 00:06:35.554 "aliases": [ 00:06:35.554 "2f658606-a922-56a5-bfbe-c1636eddc2ba" 00:06:35.554 ], 00:06:35.554 "product_name": "passthru", 00:06:35.554 "block_size": 512, 00:06:35.554 "num_blocks": 16384, 00:06:35.554 "uuid": "2f658606-a922-56a5-bfbe-c1636eddc2ba", 00:06:35.554 "assigned_rate_limits": { 00:06:35.554 "rw_ios_per_sec": 0, 00:06:35.554 "rw_mbytes_per_sec": 0, 00:06:35.554 "r_mbytes_per_sec": 0, 00:06:35.554 "w_mbytes_per_sec": 0 00:06:35.554 }, 00:06:35.554 "claimed": false, 00:06:35.554 "zoned": false, 00:06:35.554 "supported_io_types": { 00:06:35.554 "read": true, 00:06:35.554 "write": true, 00:06:35.554 "unmap": true, 00:06:35.554 "flush": true, 00:06:35.554 "reset": true, 00:06:35.554 "nvme_admin": false, 00:06:35.554 "nvme_io": false, 00:06:35.554 "nvme_io_md": false, 00:06:35.554 "write_zeroes": true, 00:06:35.554 "zcopy": true, 00:06:35.554 "get_zone_info": false, 00:06:35.554 "zone_management": false, 00:06:35.554 "zone_append": false, 00:06:35.554 "compare": false, 00:06:35.554 "compare_and_write": false, 00:06:35.554 "abort": true, 00:06:35.554 "seek_hole": false, 00:06:35.554 "seek_data": false, 00:06:35.554 "copy": true, 00:06:35.554 "nvme_iov_md": false 00:06:35.554 }, 00:06:35.554 "memory_domains": [ 00:06:35.554 { 00:06:35.554 "dma_device_id": "system", 00:06:35.554 "dma_device_type": 1 00:06:35.554 }, 00:06:35.554 { 00:06:35.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.554 "dma_device_type": 2 00:06:35.554 } 00:06:35.554 ], 00:06:35.554 "driver_specific": { 00:06:35.554 "passthru": { 00:06:35.554 "name": "Passthru0", 00:06:35.554 "base_bdev_name": "Malloc0" 00:06:35.554 } 00:06:35.554 } 00:06:35.554 } 00:06:35.554 ]' 00:06:35.554 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:35.829 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:35.829 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.829 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.829 11:01:00 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.829 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:35.829 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:35.829 11:01:00 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:35.829 00:06:35.829 real 0m0.303s 00:06:35.829 user 0m0.183s 00:06:35.829 sys 0m0.054s 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 ************************************ 00:06:35.829 END TEST rpc_integrity 00:06:35.829 ************************************ 00:06:35.829 11:01:00 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:35.829 11:01:00 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:35.829 11:01:00 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:35.829 11:01:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 ************************************ 00:06:35.829 START TEST rpc_plugins 00:06:35.829 ************************************ 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:35.829 { 00:06:35.829 "name": "Malloc1", 00:06:35.829 "aliases": [ 00:06:35.829 "61af0c2b-dd8c-434a-8a4e-63d93b0cab37" 00:06:35.829 ], 00:06:35.829 "product_name": "Malloc disk", 00:06:35.829 "block_size": 4096, 00:06:35.829 "num_blocks": 256, 00:06:35.829 "uuid": "61af0c2b-dd8c-434a-8a4e-63d93b0cab37", 00:06:35.829 "assigned_rate_limits": { 00:06:35.829 "rw_ios_per_sec": 0, 00:06:35.829 "rw_mbytes_per_sec": 0, 00:06:35.829 "r_mbytes_per_sec": 0, 00:06:35.829 "w_mbytes_per_sec": 0 00:06:35.829 }, 00:06:35.829 "claimed": false, 00:06:35.829 "zoned": false, 00:06:35.829 "supported_io_types": { 00:06:35.829 "read": true, 00:06:35.829 "write": true, 00:06:35.829 "unmap": true, 00:06:35.829 "flush": true, 00:06:35.829 "reset": true, 00:06:35.829 "nvme_admin": false, 00:06:35.829 "nvme_io": false, 00:06:35.829 "nvme_io_md": false, 00:06:35.829 "write_zeroes": true, 00:06:35.829 "zcopy": true, 00:06:35.829 "get_zone_info": false, 00:06:35.829 "zone_management": false, 00:06:35.829 "zone_append": false, 00:06:35.829 "compare": false, 00:06:35.829 "compare_and_write": false, 00:06:35.829 "abort": true, 00:06:35.829 "seek_hole": false, 00:06:35.829 "seek_data": false, 00:06:35.829 "copy": true, 00:06:35.829 
"nvme_iov_md": false 00:06:35.829 }, 00:06:35.829 "memory_domains": [ 00:06:35.829 { 00:06:35.829 "dma_device_id": "system", 00:06:35.829 "dma_device_type": 1 00:06:35.829 }, 00:06:35.829 { 00:06:35.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.829 "dma_device_type": 2 00:06:35.829 } 00:06:35.829 ], 00:06:35.829 "driver_specific": {} 00:06:35.829 } 00:06:35.829 ]' 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.829 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.829 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:35.830 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:35.830 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.830 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:35.830 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:36.110 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:36.110 11:01:00 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:36.110 00:06:36.110 real 0m0.150s 00:06:36.110 user 0m0.093s 00:06:36.110 sys 0m0.020s 00:06:36.110 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:36.110 11:01:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:36.110 ************************************ 00:06:36.110 END TEST rpc_plugins 00:06:36.110 ************************************ 00:06:36.110 11:01:00 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:36.110 11:01:00 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:36.110 11:01:00 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:36.110 11:01:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.110 ************************************ 00:06:36.110 START TEST rpc_trace_cmd_test 00:06:36.110 ************************************ 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.110 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:36.110 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid134890", 00:06:36.110 "tpoint_group_mask": "0x8", 00:06:36.110 "iscsi_conn": { 00:06:36.110 "mask": "0x2", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "scsi": { 00:06:36.110 "mask": "0x4", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "bdev": { 00:06:36.110 "mask": "0x8", 00:06:36.110 "tpoint_mask": "0xffffffffffffffff" 00:06:36.110 }, 00:06:36.110 "nvmf_rdma": { 00:06:36.110 "mask": "0x10", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "nvmf_tcp": { 00:06:36.110 "mask": "0x20", 
00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "ftl": { 00:06:36.110 "mask": "0x40", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "blobfs": { 00:06:36.110 "mask": "0x80", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "dsa": { 00:06:36.110 "mask": "0x200", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "thread": { 00:06:36.110 "mask": "0x400", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "nvme_pcie": { 00:06:36.110 "mask": "0x800", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "iaa": { 00:06:36.110 "mask": "0x1000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "nvme_tcp": { 00:06:36.110 "mask": "0x2000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "bdev_nvme": { 00:06:36.110 "mask": "0x4000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "sock": { 00:06:36.110 "mask": "0x8000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "blob": { 00:06:36.110 "mask": "0x10000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "bdev_raid": { 00:06:36.110 "mask": "0x20000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 }, 00:06:36.110 "scheduler": { 00:06:36.110 "mask": "0x40000", 00:06:36.110 "tpoint_mask": "0x0" 00:06:36.110 } 00:06:36.110 }' 00:06:36.111 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:36.111 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:06:36.111 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:36.111 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:36.111 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:36.385 00:06:36.385 real 0m0.240s 00:06:36.385 user 0m0.195s 00:06:36.385 sys 0m0.037s 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:36.385 11:01:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:36.385 ************************************ 00:06:36.385 END TEST rpc_trace_cmd_test 00:06:36.385 ************************************ 00:06:36.385 11:01:00 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:36.385 11:01:00 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:36.385 11:01:00 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:36.385 11:01:00 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:36.385 11:01:00 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:36.385 11:01:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.385 ************************************ 00:06:36.385 START TEST rpc_daemon_integrity 00:06:36.385 ************************************ 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.385 11:01:00 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.385 11:01:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.385 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.385 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:36.385 { 00:06:36.385 "name": "Malloc2", 00:06:36.385 "aliases": [ 00:06:36.385 "0065f0f7-55b6-49fe-8b5a-fd73c64f169b" 00:06:36.385 ], 00:06:36.385 "product_name": "Malloc disk", 00:06:36.385 "block_size": 512, 00:06:36.385 "num_blocks": 16384, 00:06:36.385 "uuid": "0065f0f7-55b6-49fe-8b5a-fd73c64f169b", 00:06:36.385 "assigned_rate_limits": { 00:06:36.385 "rw_ios_per_sec": 0, 00:06:36.385 "rw_mbytes_per_sec": 0, 00:06:36.385 "r_mbytes_per_sec": 0, 00:06:36.385 "w_mbytes_per_sec": 0 00:06:36.385 }, 00:06:36.385 "claimed": false, 00:06:36.385 "zoned": false, 00:06:36.385 "supported_io_types": { 00:06:36.385 "read": true, 00:06:36.385 "write": true, 00:06:36.385 "unmap": true, 00:06:36.385 "flush": true, 00:06:36.385 "reset": true, 00:06:36.385 "nvme_admin": false, 00:06:36.385 "nvme_io": false, 00:06:36.385 "nvme_io_md": false, 00:06:36.385 "write_zeroes": true, 00:06:36.385 "zcopy": true, 00:06:36.385 "get_zone_info": false, 00:06:36.385 "zone_management": false, 00:06:36.385 "zone_append": false, 00:06:36.385 "compare": false, 00:06:36.385 "compare_and_write": false, 00:06:36.385 "abort": true, 00:06:36.385 "seek_hole": false, 00:06:36.385 "seek_data": false, 00:06:36.385 "copy": true, 00:06:36.385 "nvme_iov_md": false 00:06:36.385 }, 00:06:36.385 "memory_domains": [ 00:06:36.385 { 00:06:36.385 "dma_device_id": "system", 00:06:36.385 "dma_device_type": 1 00:06:36.385 }, 00:06:36.385 { 00:06:36.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.385 "dma_device_type": 2 00:06:36.385 } 00:06:36.385 ], 00:06:36.385 "driver_specific": {} 00:06:36.385 } 00:06:36.385 ]' 00:06:36.385 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.679 [2024-11-17 11:01:01.071022] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:36.679 
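Everything rpc_cmd does in the two integrity tests maps onto plain scripts/rpc.py calls against the default /var/tmp/spdk.sock socket; the method names and arguments below are taken verbatim from the trace, and only the manual invocation style is an assumption:

  ./scripts/rpc.py bdev_malloc_create 8 512                       # 8 MiB disk, 512 B blocks -> prints e.g. Malloc2
  ./scripts/rpc.py bdev_passthru_create -b Malloc2 -p Passthru0   # claim the base bdev
  ./scripts/rpc.py bdev_get_bdevs | jq length                     # expect 2: base + passthru on top
  ./scripts/rpc.py bdev_passthru_delete Passthru0
  ./scripts/rpc.py bdev_malloc_delete Malloc2
  ./scripts/rpc.py bdev_get_bdevs | jq length                     # back to 0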
[2024-11-17 11:01:01.071057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:36.679 [2024-11-17 11:01:01.071080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5964a20 00:06:36.679 [2024-11-17 11:01:01.071089] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:36.679 [2024-11-17 11:01:01.071834] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:36.679 [2024-11-17 11:01:01.071855] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:36.679 Passthru0 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.679 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:36.679 { 00:06:36.679 "name": "Malloc2", 00:06:36.679 "aliases": [ 00:06:36.679 "0065f0f7-55b6-49fe-8b5a-fd73c64f169b" 00:06:36.679 ], 00:06:36.679 "product_name": "Malloc disk", 00:06:36.679 "block_size": 512, 00:06:36.679 "num_blocks": 16384, 00:06:36.679 "uuid": "0065f0f7-55b6-49fe-8b5a-fd73c64f169b", 00:06:36.679 "assigned_rate_limits": { 00:06:36.679 "rw_ios_per_sec": 0, 00:06:36.679 "rw_mbytes_per_sec": 0, 00:06:36.679 "r_mbytes_per_sec": 0, 00:06:36.679 "w_mbytes_per_sec": 0 00:06:36.679 }, 00:06:36.679 "claimed": true, 00:06:36.679 "claim_type": "exclusive_write", 00:06:36.679 "zoned": false, 00:06:36.679 "supported_io_types": { 00:06:36.679 "read": true, 00:06:36.679 "write": true, 00:06:36.679 "unmap": true, 00:06:36.680 "flush": true, 00:06:36.680 "reset": true, 00:06:36.680 "nvme_admin": false, 00:06:36.680 "nvme_io": false, 00:06:36.680 "nvme_io_md": false, 00:06:36.680 "write_zeroes": true, 00:06:36.680 "zcopy": true, 00:06:36.680 "get_zone_info": false, 00:06:36.680 "zone_management": false, 00:06:36.680 "zone_append": false, 00:06:36.680 "compare": false, 00:06:36.680 "compare_and_write": false, 00:06:36.680 "abort": true, 00:06:36.680 "seek_hole": false, 00:06:36.680 "seek_data": false, 00:06:36.680 "copy": true, 00:06:36.680 "nvme_iov_md": false 00:06:36.680 }, 00:06:36.680 "memory_domains": [ 00:06:36.680 { 00:06:36.680 "dma_device_id": "system", 00:06:36.680 "dma_device_type": 1 00:06:36.680 }, 00:06:36.680 { 00:06:36.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.680 "dma_device_type": 2 00:06:36.680 } 00:06:36.680 ], 00:06:36.680 "driver_specific": {} 00:06:36.680 }, 00:06:36.680 { 00:06:36.680 "name": "Passthru0", 00:06:36.680 "aliases": [ 00:06:36.680 "47e4a805-778f-5e52-8982-ee52abbb7402" 00:06:36.680 ], 00:06:36.680 "product_name": "passthru", 00:06:36.680 "block_size": 512, 00:06:36.680 "num_blocks": 16384, 00:06:36.680 "uuid": "47e4a805-778f-5e52-8982-ee52abbb7402", 00:06:36.680 "assigned_rate_limits": { 00:06:36.680 "rw_ios_per_sec": 0, 00:06:36.680 "rw_mbytes_per_sec": 0, 00:06:36.680 "r_mbytes_per_sec": 0, 00:06:36.680 "w_mbytes_per_sec": 0 00:06:36.680 }, 00:06:36.680 "claimed": false, 00:06:36.680 "zoned": false, 00:06:36.680 "supported_io_types": { 00:06:36.680 "read": true, 00:06:36.680 "write": true, 00:06:36.680 "unmap": true, 00:06:36.680 "flush": true, 00:06:36.680 "reset": true, 
00:06:36.680 "nvme_admin": false, 00:06:36.680 "nvme_io": false, 00:06:36.680 "nvme_io_md": false, 00:06:36.680 "write_zeroes": true, 00:06:36.680 "zcopy": true, 00:06:36.680 "get_zone_info": false, 00:06:36.680 "zone_management": false, 00:06:36.680 "zone_append": false, 00:06:36.680 "compare": false, 00:06:36.680 "compare_and_write": false, 00:06:36.680 "abort": true, 00:06:36.680 "seek_hole": false, 00:06:36.680 "seek_data": false, 00:06:36.680 "copy": true, 00:06:36.680 "nvme_iov_md": false 00:06:36.680 }, 00:06:36.680 "memory_domains": [ 00:06:36.680 { 00:06:36.680 "dma_device_id": "system", 00:06:36.680 "dma_device_type": 1 00:06:36.680 }, 00:06:36.680 { 00:06:36.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.680 "dma_device_type": 2 00:06:36.680 } 00:06:36.680 ], 00:06:36.680 "driver_specific": { 00:06:36.680 "passthru": { 00:06:36.680 "name": "Passthru0", 00:06:36.680 "base_bdev_name": "Malloc2" 00:06:36.680 } 00:06:36.680 } 00:06:36.680 } 00:06:36.680 ]' 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:36.680 00:06:36.680 real 0m0.305s 00:06:36.680 user 0m0.182s 00:06:36.680 sys 0m0.058s 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:36.680 11:01:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.680 ************************************ 00:06:36.680 END TEST rpc_daemon_integrity 00:06:36.680 ************************************ 00:06:36.680 11:01:01 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:36.680 11:01:01 rpc -- rpc/rpc.sh@84 -- # killprocess 134890 00:06:36.680 11:01:01 rpc -- common/autotest_common.sh@954 -- # '[' -z 134890 ']' 00:06:36.680 11:01:01 rpc -- common/autotest_common.sh@958 -- # kill -0 134890 00:06:36.680 11:01:01 rpc -- common/autotest_common.sh@959 -- # uname 00:06:36.680 11:01:01 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:36.680 11:01:01 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 134890 
00:06:37.000 11:01:01 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:37.000 11:01:01 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:37.000 11:01:01 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 134890' 00:06:37.000 killing process with pid 134890 00:06:37.000 11:01:01 rpc -- common/autotest_common.sh@973 -- # kill 134890 00:06:37.000 11:01:01 rpc -- common/autotest_common.sh@978 -- # wait 134890 00:06:37.307 00:06:37.307 real 0m2.200s 00:06:37.307 user 0m2.804s 00:06:37.307 sys 0m0.830s 00:06:37.307 11:01:01 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:37.307 11:01:01 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:37.308 ************************************ 00:06:37.308 END TEST rpc 00:06:37.308 ************************************ 00:06:37.308 11:01:01 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:37.308 11:01:01 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:37.308 11:01:01 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:37.308 11:01:01 -- common/autotest_common.sh@10 -- # set +x 00:06:37.308 ************************************ 00:06:37.308 START TEST skip_rpc 00:06:37.308 ************************************ 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:37.308 * Looking for test storage... 00:06:37.308 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:37.308 11:01:01 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:37.308 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.308 --rc genhtml_branch_coverage=1 00:06:37.308 --rc genhtml_function_coverage=1 00:06:37.308 --rc genhtml_legend=1 00:06:37.308 --rc geninfo_all_blocks=1 00:06:37.308 --rc geninfo_unexecuted_blocks=1 00:06:37.308 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.308 ' 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:37.308 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.308 --rc genhtml_branch_coverage=1 00:06:37.308 --rc genhtml_function_coverage=1 00:06:37.308 --rc genhtml_legend=1 00:06:37.308 --rc geninfo_all_blocks=1 00:06:37.308 --rc geninfo_unexecuted_blocks=1 00:06:37.308 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.308 ' 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:37.308 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.308 --rc genhtml_branch_coverage=1 00:06:37.308 --rc genhtml_function_coverage=1 00:06:37.308 --rc genhtml_legend=1 00:06:37.308 --rc geninfo_all_blocks=1 00:06:37.308 --rc geninfo_unexecuted_blocks=1 00:06:37.308 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.308 ' 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:37.308 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.308 --rc genhtml_branch_coverage=1 00:06:37.308 --rc genhtml_function_coverage=1 00:06:37.308 --rc genhtml_legend=1 00:06:37.308 --rc geninfo_all_blocks=1 00:06:37.308 --rc geninfo_unexecuted_blocks=1 00:06:37.308 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.308 ' 00:06:37.308 11:01:01 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:37.308 11:01:01 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:37.308 11:01:01 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:37.308 11:01:01 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:37.308 11:01:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:37.308 ************************************ 00:06:37.308 START TEST skip_rpc 00:06:37.308 ************************************ 00:06:37.308 11:01:01 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:06:37.586 11:01:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=135370 00:06:37.586 11:01:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:37.586 11:01:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:37.586 11:01:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:37.586 [2024-11-17 11:01:01.972548] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:37.586 [2024-11-17 11:01:01.972605] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid135370 ] 00:06:37.586 [2024-11-17 11:01:02.057683] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.586 [2024-11-17 11:01:02.079239] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 135370 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 135370 ']' 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 135370 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:42.875 11:01:06 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 135370 00:06:42.875 
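skip_rpc's whole point is that a target started with --no-rpc-server must reject RPC clients, so the test wraps rpc_cmd in NOT and passes only when the call fails. The inversion logic, reduced from the autotest_common.sh xtrace above (signal deaths, exit status > 128, still count as real failures):

  NOT_sketch() {
      local es=0
      "$@" || es=$?
      (( es > 128 )) && return "$es"   # killed by a signal: propagate, do not invert
      (( !es == 0 ))                   # succeed only if the wrapped command failed
  }
  NOT_sketch rpc_cmd spdk_get_version   # green while no RPC server is listening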
11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:42.875 11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:42.875 11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 135370' 00:06:42.875 killing process with pid 135370 00:06:42.875 11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 135370 00:06:42.875 11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 135370 00:06:42.875 00:06:42.875 real 0m5.365s 00:06:42.875 user 0m5.118s 00:06:42.875 sys 0m0.298s 00:06:42.875 11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:42.875 11:01:07 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:42.875 ************************************ 00:06:42.875 END TEST skip_rpc 00:06:42.875 ************************************ 00:06:42.875 11:01:07 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:42.875 11:01:07 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:42.875 11:01:07 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:42.875 11:01:07 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:42.875 ************************************ 00:06:42.875 START TEST skip_rpc_with_json 00:06:42.875 ************************************ 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=136457 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 136457 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 136457 ']' 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:42.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:42.875 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:42.875 [2024-11-17 11:01:07.419114] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
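skip_rpc_with_json starts a normal target and gates on waitforlisten before touching it; the trace shows the helper's locals (rpc_addr=/var/tmp/spdk.sock, max_retries=100). A rough sketch of that gate, with the poll interval and the readiness RPC as assumptions:

  waitforlisten_sketch() {
      local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} i
      echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
      for ((i = 0; i < 100; i++)); do                     # max_retries=100 as in the trace
          kill -0 "$pid" 2> /dev/null || return 1         # target died during startup
          [ -S "$rpc_addr" ] &&
              ./scripts/rpc.py -s "$rpc_addr" rpc_get_methods &> /dev/null && return 0
          sleep 0.5                                       # interval: assumption
      done
      return 1
  }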
00:06:42.875 [2024-11-17 11:01:07.419192] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid136457 ] 00:06:42.875 [2024-11-17 11:01:07.506548] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.875 [2024-11-17 11:01:07.528684] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:43.135 [2024-11-17 11:01:07.724983] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:43.135 request: 00:06:43.135 { 00:06:43.135 "trtype": "tcp", 00:06:43.135 "method": "nvmf_get_transports", 00:06:43.135 "req_id": 1 00:06:43.135 } 00:06:43.135 Got JSON-RPC error response 00:06:43.135 response: 00:06:43.135 { 00:06:43.135 "code": -19, 00:06:43.135 "message": "No such device" 00:06:43.135 } 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:43.135 [2024-11-17 11:01:07.737080] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:43.135 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:43.394 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:43.395 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:43.395 { 00:06:43.395 "subsystems": [ 00:06:43.395 { 00:06:43.395 "subsystem": "scheduler", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "framework_set_scheduler", 00:06:43.395 "params": { 00:06:43.395 "name": "static" 00:06:43.395 } 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "vmd", 00:06:43.395 "config": [] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "sock", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "sock_set_default_impl", 00:06:43.395 "params": { 00:06:43.395 "impl_name": "posix" 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "sock_impl_set_options", 00:06:43.395 "params": { 00:06:43.395 "impl_name": "ssl", 00:06:43.395 "recv_buf_size": 4096, 00:06:43.395 "send_buf_size": 4096, 00:06:43.395 "enable_recv_pipe": true, 00:06:43.395 "enable_quickack": false, 00:06:43.395 "enable_placement_id": 
0, 00:06:43.395 "enable_zerocopy_send_server": true, 00:06:43.395 "enable_zerocopy_send_client": false, 00:06:43.395 "zerocopy_threshold": 0, 00:06:43.395 "tls_version": 0, 00:06:43.395 "enable_ktls": false 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "sock_impl_set_options", 00:06:43.395 "params": { 00:06:43.395 "impl_name": "posix", 00:06:43.395 "recv_buf_size": 2097152, 00:06:43.395 "send_buf_size": 2097152, 00:06:43.395 "enable_recv_pipe": true, 00:06:43.395 "enable_quickack": false, 00:06:43.395 "enable_placement_id": 0, 00:06:43.395 "enable_zerocopy_send_server": true, 00:06:43.395 "enable_zerocopy_send_client": false, 00:06:43.395 "zerocopy_threshold": 0, 00:06:43.395 "tls_version": 0, 00:06:43.395 "enable_ktls": false 00:06:43.395 } 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "iobuf", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "iobuf_set_options", 00:06:43.395 "params": { 00:06:43.395 "small_pool_count": 8192, 00:06:43.395 "large_pool_count": 1024, 00:06:43.395 "small_bufsize": 8192, 00:06:43.395 "large_bufsize": 135168, 00:06:43.395 "enable_numa": false 00:06:43.395 } 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "keyring", 00:06:43.395 "config": [] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "vfio_user_target", 00:06:43.395 "config": null 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "fsdev", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "fsdev_set_opts", 00:06:43.395 "params": { 00:06:43.395 "fsdev_io_pool_size": 65535, 00:06:43.395 "fsdev_io_cache_size": 256 00:06:43.395 } 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "accel", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "accel_set_options", 00:06:43.395 "params": { 00:06:43.395 "small_cache_size": 128, 00:06:43.395 "large_cache_size": 16, 00:06:43.395 "task_count": 2048, 00:06:43.395 "sequence_count": 2048, 00:06:43.395 "buf_count": 2048 00:06:43.395 } 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "bdev", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "bdev_set_options", 00:06:43.395 "params": { 00:06:43.395 "bdev_io_pool_size": 65535, 00:06:43.395 "bdev_io_cache_size": 256, 00:06:43.395 "bdev_auto_examine": true, 00:06:43.395 "iobuf_small_cache_size": 128, 00:06:43.395 "iobuf_large_cache_size": 16 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "bdev_raid_set_options", 00:06:43.395 "params": { 00:06:43.395 "process_window_size_kb": 1024, 00:06:43.395 "process_max_bandwidth_mb_sec": 0 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "bdev_nvme_set_options", 00:06:43.395 "params": { 00:06:43.395 "action_on_timeout": "none", 00:06:43.395 "timeout_us": 0, 00:06:43.395 "timeout_admin_us": 0, 00:06:43.395 "keep_alive_timeout_ms": 10000, 00:06:43.395 "arbitration_burst": 0, 00:06:43.395 "low_priority_weight": 0, 00:06:43.395 "medium_priority_weight": 0, 00:06:43.395 "high_priority_weight": 0, 00:06:43.395 "nvme_adminq_poll_period_us": 10000, 00:06:43.395 "nvme_ioq_poll_period_us": 0, 00:06:43.395 "io_queue_requests": 0, 00:06:43.395 "delay_cmd_submit": true, 00:06:43.395 "transport_retry_count": 4, 00:06:43.395 "bdev_retry_count": 3, 00:06:43.395 "transport_ack_timeout": 0, 00:06:43.395 "ctrlr_loss_timeout_sec": 0, 00:06:43.395 "reconnect_delay_sec": 0, 00:06:43.395 "fast_io_fail_timeout_sec": 0, 00:06:43.395 
"disable_auto_failback": false, 00:06:43.395 "generate_uuids": false, 00:06:43.395 "transport_tos": 0, 00:06:43.395 "nvme_error_stat": false, 00:06:43.395 "rdma_srq_size": 0, 00:06:43.395 "io_path_stat": false, 00:06:43.395 "allow_accel_sequence": false, 00:06:43.395 "rdma_max_cq_size": 0, 00:06:43.395 "rdma_cm_event_timeout_ms": 0, 00:06:43.395 "dhchap_digests": [ 00:06:43.395 "sha256", 00:06:43.395 "sha384", 00:06:43.395 "sha512" 00:06:43.395 ], 00:06:43.395 "dhchap_dhgroups": [ 00:06:43.395 "null", 00:06:43.395 "ffdhe2048", 00:06:43.395 "ffdhe3072", 00:06:43.395 "ffdhe4096", 00:06:43.395 "ffdhe6144", 00:06:43.395 "ffdhe8192" 00:06:43.395 ] 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "bdev_nvme_set_hotplug", 00:06:43.395 "params": { 00:06:43.395 "period_us": 100000, 00:06:43.395 "enable": false 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "bdev_iscsi_set_options", 00:06:43.395 "params": { 00:06:43.395 "timeout_sec": 30 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "bdev_wait_for_examine" 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "nvmf", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "nvmf_set_config", 00:06:43.395 "params": { 00:06:43.395 "discovery_filter": "match_any", 00:06:43.395 "admin_cmd_passthru": { 00:06:43.395 "identify_ctrlr": false 00:06:43.395 }, 00:06:43.395 "dhchap_digests": [ 00:06:43.395 "sha256", 00:06:43.395 "sha384", 00:06:43.395 "sha512" 00:06:43.395 ], 00:06:43.395 "dhchap_dhgroups": [ 00:06:43.395 "null", 00:06:43.395 "ffdhe2048", 00:06:43.395 "ffdhe3072", 00:06:43.395 "ffdhe4096", 00:06:43.395 "ffdhe6144", 00:06:43.395 "ffdhe8192" 00:06:43.395 ] 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "nvmf_set_max_subsystems", 00:06:43.395 "params": { 00:06:43.395 "max_subsystems": 1024 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "nvmf_set_crdt", 00:06:43.395 "params": { 00:06:43.395 "crdt1": 0, 00:06:43.395 "crdt2": 0, 00:06:43.395 "crdt3": 0 00:06:43.395 } 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "method": "nvmf_create_transport", 00:06:43.395 "params": { 00:06:43.395 "trtype": "TCP", 00:06:43.395 "max_queue_depth": 128, 00:06:43.395 "max_io_qpairs_per_ctrlr": 127, 00:06:43.395 "in_capsule_data_size": 4096, 00:06:43.395 "max_io_size": 131072, 00:06:43.395 "io_unit_size": 131072, 00:06:43.395 "max_aq_depth": 128, 00:06:43.395 "num_shared_buffers": 511, 00:06:43.395 "buf_cache_size": 4294967295, 00:06:43.395 "dif_insert_or_strip": false, 00:06:43.395 "zcopy": false, 00:06:43.395 "c2h_success": true, 00:06:43.395 "sock_priority": 0, 00:06:43.395 "abort_timeout_sec": 1, 00:06:43.395 "ack_timeout": 0, 00:06:43.395 "data_wr_pool_size": 0 00:06:43.395 } 00:06:43.395 } 00:06:43.395 ] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "nbd", 00:06:43.395 "config": [] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "ublk", 00:06:43.395 "config": [] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "vhost_blk", 00:06:43.395 "config": [] 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "scsi", 00:06:43.395 "config": null 00:06:43.395 }, 00:06:43.395 { 00:06:43.395 "subsystem": "iscsi", 00:06:43.395 "config": [ 00:06:43.395 { 00:06:43.395 "method": "iscsi_set_options", 00:06:43.395 "params": { 00:06:43.396 "node_base": "iqn.2016-06.io.spdk", 00:06:43.396 "max_sessions": 128, 00:06:43.396 "max_connections_per_session": 2, 00:06:43.396 "max_queue_depth": 64, 00:06:43.396 
"default_time2wait": 2, 00:06:43.396 "default_time2retain": 20, 00:06:43.396 "first_burst_length": 8192, 00:06:43.396 "immediate_data": true, 00:06:43.396 "allow_duplicated_isid": false, 00:06:43.396 "error_recovery_level": 0, 00:06:43.396 "nop_timeout": 60, 00:06:43.396 "nop_in_interval": 30, 00:06:43.396 "disable_chap": false, 00:06:43.396 "require_chap": false, 00:06:43.396 "mutual_chap": false, 00:06:43.396 "chap_group": 0, 00:06:43.396 "max_large_datain_per_connection": 64, 00:06:43.396 "max_r2t_per_connection": 4, 00:06:43.396 "pdu_pool_size": 36864, 00:06:43.396 "immediate_data_pool_size": 16384, 00:06:43.396 "data_out_pool_size": 2048 00:06:43.396 } 00:06:43.396 } 00:06:43.396 ] 00:06:43.396 }, 00:06:43.396 { 00:06:43.396 "subsystem": "vhost_scsi", 00:06:43.396 "config": [] 00:06:43.396 } 00:06:43.396 ] 00:06:43.396 } 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 136457 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 136457 ']' 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 136457 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 136457 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 136457' 00:06:43.396 killing process with pid 136457 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 136457 00:06:43.396 11:01:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 136457 00:06:43.655 11:01:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=136495 00:06:43.655 11:01:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:43.655 11:01:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 136495 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 136495 ']' 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 136495 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 136495 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 
'killing process with pid 136495' 00:06:48.932 killing process with pid 136495 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 136495 00:06:48.932 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 136495 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:49.192 00:06:49.192 real 0m6.237s 00:06:49.192 user 0m5.904s 00:06:49.192 sys 0m0.668s 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:49.192 ************************************ 00:06:49.192 END TEST skip_rpc_with_json 00:06:49.192 ************************************ 00:06:49.192 11:01:13 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:49.192 11:01:13 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:49.192 11:01:13 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:49.192 11:01:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.192 ************************************ 00:06:49.192 START TEST skip_rpc_with_delay 00:06:49.192 ************************************ 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 
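The invocation just traced is the negative half of skip_rpc_with_delay: --wait-for-rpc requires an RPC server, so pairing it with --no-rpc-server has to fail, and the *ERROR* line that follows confirms it. A minimal hedged sketch of the same check outside the harness, assuming only the spdk_tgt path that appears in this log:

  #!/usr/bin/env bash
  # Sketch, not the harness source: spdk_tgt must refuse --wait-for-rpc
  # when the RPC server is disabled via --no-rpc-server.
  SPDK_TGT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  if "$SPDK_TGT" --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo "FAIL: conflicting flags were accepted" >&2
      exit 1
  fi
  echo "PASS: target rejected --wait-for-rpc without an RPC server"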
00:06:49.192 [2024-11-17 11:01:13.743336] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:49.192 00:06:49.192 real 0m0.048s 00:06:49.192 user 0m0.024s 00:06:49.192 sys 0m0.024s 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:49.192 11:01:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:49.192 ************************************ 00:06:49.192 END TEST skip_rpc_with_delay 00:06:49.192 ************************************ 00:06:49.192 11:01:13 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:49.192 11:01:13 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:49.192 11:01:13 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:49.192 11:01:13 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:49.192 11:01:13 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:49.192 11:01:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.452 ************************************ 00:06:49.452 START TEST exit_on_failed_rpc_init 00:06:49.452 ************************************ 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=137586 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 137586 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 137586 ']' 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:49.452 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:49.452 11:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:49.452 [2024-11-17 11:01:13.876764] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:06:49.452 [2024-11-17 11:01:13.876849] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid137586 ] 00:06:49.452 [2024-11-17 11:01:13.964818] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.452 [2024-11-17 11:01:13.988720] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.713 [2024-11-17 11:01:14.207286] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:49.713 [2024-11-17 11:01:14.207365] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid137592 ] 00:06:49.713 [2024-11-17 11:01:14.289882] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.713 [2024-11-17 11:01:14.312133] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:49.713 [2024-11-17 11:01:14.312211] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
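The "in use. Specify another." error above is the point of exit_on_failed_rpc_init: the first target (pid 137586) already owns /var/tmp/spdk.sock, so a second target started without a distinct -r socket cannot bring up its RPC listener. A hedged sketch of the same collision, reusing only details visible in this log:

  # Sketch: two targets contending for the default RPC socket.
  SPDK_TGT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  "$SPDK_TGT" -m 0x1 &              # first instance binds /var/tmp/spdk.sock
  first_pid=$!
  sleep 2                           # crude settle; the suite polls the socket instead
  if "$SPDK_TGT" -m 0x2; then       # second instance is expected to exit non-zero
      echo "FAIL: duplicate RPC socket was accepted" >&2
  fi
  kill -SIGINT "$first_pid"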
00:06:49.713 [2024-11-17 11:01:14.312225] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:49.713 [2024-11-17 11:01:14.312233] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 137586 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 137586 ']' 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 137586 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:49.713 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 137586 00:06:49.973 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:49.973 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:49.973 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 137586' 00:06:49.973 killing process with pid 137586 00:06:49.973 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 137586 00:06:49.973 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 137586 00:06:50.233 00:06:50.233 real 0m0.840s 00:06:50.233 user 0m0.823s 00:06:50.233 sys 0m0.431s 00:06:50.233 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:50.233 11:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:50.233 ************************************ 00:06:50.233 END TEST exit_on_failed_rpc_init 00:06:50.233 ************************************ 00:06:50.233 11:01:14 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:50.233 00:06:50.233 real 0m13.030s 00:06:50.233 user 0m12.094s 00:06:50.233 sys 0m1.780s 00:06:50.233 11:01:14 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:50.233 11:01:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.233 ************************************ 00:06:50.233 END TEST skip_rpc 00:06:50.233 ************************************ 00:06:50.233 11:01:14 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:50.233 11:01:14 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:50.233 11:01:14 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:50.233 11:01:14 -- 
common/autotest_common.sh@10 -- # set +x 00:06:50.233 ************************************ 00:06:50.233 START TEST rpc_client 00:06:50.233 ************************************ 00:06:50.233 11:01:14 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:50.493 * Looking for test storage... 00:06:50.493 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:50.493 11:01:14 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:50.493 11:01:14 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:06:50.493 11:01:14 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.493 11:01:15 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:50.493 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.493 --rc genhtml_branch_coverage=1 00:06:50.493 --rc genhtml_function_coverage=1 00:06:50.493 --rc genhtml_legend=1 00:06:50.493 --rc geninfo_all_blocks=1 00:06:50.493 --rc geninfo_unexecuted_blocks=1 00:06:50.493 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.493 ' 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:50.493 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.493 --rc genhtml_branch_coverage=1 00:06:50.493 --rc genhtml_function_coverage=1 00:06:50.493 --rc genhtml_legend=1 00:06:50.493 --rc geninfo_all_blocks=1 00:06:50.493 --rc geninfo_unexecuted_blocks=1 00:06:50.493 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.493 ' 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:50.493 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.493 --rc genhtml_branch_coverage=1 00:06:50.493 --rc genhtml_function_coverage=1 00:06:50.493 --rc genhtml_legend=1 00:06:50.493 --rc geninfo_all_blocks=1 00:06:50.493 --rc geninfo_unexecuted_blocks=1 00:06:50.493 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.493 ' 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:50.493 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.493 --rc genhtml_branch_coverage=1 00:06:50.493 --rc genhtml_function_coverage=1 00:06:50.493 --rc genhtml_legend=1 00:06:50.493 --rc geninfo_all_blocks=1 00:06:50.493 --rc geninfo_unexecuted_blocks=1 00:06:50.493 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.493 ' 00:06:50.493 11:01:15 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:50.493 OK 00:06:50.493 11:01:15 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:50.493 00:06:50.493 real 0m0.224s 00:06:50.493 user 0m0.119s 00:06:50.493 sys 0m0.124s 00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
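The cmp_versions trace above splits each dotted version on IFS=.-: and walks the fields left to right; lt 1.15 2 returns 0 here because 1 < 2 already decides the first field. A condensed, hedged reimplementation of that comparison (the log shows only fragments of the real scripts/common.sh):

  # Returns 0 when version $1 is strictly lower than $2, e.g. version_lt 1.15 2.
  version_lt() {
      local IFS=.-: v d1 d2
      local -a ver1 ver2
      read -ra ver1 <<< "$1"
      read -ra ver2 <<< "$2"
      for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
          d1=${ver1[v]:-0}; d2=${ver2[v]:-0}
          (( d1 < d2 )) && return 0
          (( d1 > d2 )) && return 1
      done
      return 1   # equal versions are not less-than
  }
  version_lt 1.15 2 && echo "lcov predates 2.x"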
00:06:50.493 11:01:15 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:50.493 ************************************ 00:06:50.493 END TEST rpc_client 00:06:50.493 ************************************ 00:06:50.493 11:01:15 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:50.493 11:01:15 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:50.493 11:01:15 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:50.493 11:01:15 -- common/autotest_common.sh@10 -- # set +x 00:06:50.493 ************************************ 00:06:50.493 START TEST json_config 00:06:50.493 ************************************ 00:06:50.493 11:01:15 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.761 11:01:15 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.761 11:01:15 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.761 11:01:15 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.761 11:01:15 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.761 11:01:15 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.761 11:01:15 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:50.761 11:01:15 json_config -- scripts/common.sh@345 -- # : 1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.761 11:01:15 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.761 11:01:15 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@353 -- # local d=1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.761 11:01:15 json_config -- scripts/common.sh@355 -- # echo 1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.761 11:01:15 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@353 -- # local d=2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.761 11:01:15 json_config -- scripts/common.sh@355 -- # echo 2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.761 11:01:15 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.761 11:01:15 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.761 11:01:15 json_config -- scripts/common.sh@368 -- # return 0 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:50.761 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.761 --rc genhtml_branch_coverage=1 00:06:50.761 --rc genhtml_function_coverage=1 00:06:50.761 --rc genhtml_legend=1 00:06:50.761 --rc geninfo_all_blocks=1 00:06:50.761 --rc geninfo_unexecuted_blocks=1 00:06:50.761 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.761 ' 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:50.761 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.761 --rc genhtml_branch_coverage=1 00:06:50.761 --rc genhtml_function_coverage=1 00:06:50.761 --rc genhtml_legend=1 00:06:50.761 --rc geninfo_all_blocks=1 00:06:50.761 --rc geninfo_unexecuted_blocks=1 00:06:50.761 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.761 ' 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:50.761 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.761 --rc genhtml_branch_coverage=1 00:06:50.761 --rc genhtml_function_coverage=1 00:06:50.761 --rc genhtml_legend=1 00:06:50.761 --rc geninfo_all_blocks=1 00:06:50.761 --rc geninfo_unexecuted_blocks=1 00:06:50.761 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.761 ' 00:06:50.761 11:01:15 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:50.761 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.761 --rc genhtml_branch_coverage=1 00:06:50.761 --rc genhtml_function_coverage=1 00:06:50.761 --rc genhtml_legend=1 00:06:50.761 --rc geninfo_all_blocks=1 00:06:50.761 --rc geninfo_unexecuted_blocks=1 00:06:50.761 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.761 ' 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:50.762 11:01:15 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:50.762 11:01:15 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:50.762 11:01:15 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:50.762 11:01:15 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:50.762 11:01:15 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.762 11:01:15 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.762 11:01:15 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.762 11:01:15 json_config -- paths/export.sh@5 -- # export PATH 00:06:50.762 11:01:15 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@51 -- # : 0 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:50.762 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:50.762 11:01:15 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:50.762 WARNING: No tests are enabled so not running JSON configuration tests 00:06:50.762 11:01:15 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:50.762 00:06:50.762 real 0m0.205s 00:06:50.762 user 0m0.126s 00:06:50.762 sys 0m0.089s 00:06:50.762 11:01:15 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:50.762 11:01:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:50.762 ************************************ 00:06:50.762 END TEST json_config 00:06:50.762 ************************************ 00:06:50.762 11:01:15 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:50.762 11:01:15 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:50.762 11:01:15 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:50.762 11:01:15 -- common/autotest_common.sh@10 -- # set +x 00:06:51.023 ************************************ 00:06:51.023 START TEST json_config_extra_key 00:06:51.023 ************************************ 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:51.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.023 --rc genhtml_branch_coverage=1 00:06:51.023 --rc genhtml_function_coverage=1 00:06:51.023 --rc genhtml_legend=1 00:06:51.023 --rc geninfo_all_blocks=1 00:06:51.023 --rc geninfo_unexecuted_blocks=1 00:06:51.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.023 ' 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:51.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.023 --rc genhtml_branch_coverage=1 
00:06:51.023 --rc genhtml_function_coverage=1 00:06:51.023 --rc genhtml_legend=1 00:06:51.023 --rc geninfo_all_blocks=1 00:06:51.023 --rc geninfo_unexecuted_blocks=1 00:06:51.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.023 ' 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:51.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.023 --rc genhtml_branch_coverage=1 00:06:51.023 --rc genhtml_function_coverage=1 00:06:51.023 --rc genhtml_legend=1 00:06:51.023 --rc geninfo_all_blocks=1 00:06:51.023 --rc geninfo_unexecuted_blocks=1 00:06:51.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.023 ' 00:06:51.023 11:01:15 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:51.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.023 --rc genhtml_branch_coverage=1 00:06:51.023 --rc genhtml_function_coverage=1 00:06:51.023 --rc genhtml_legend=1 00:06:51.023 --rc geninfo_all_blocks=1 00:06:51.023 --rc geninfo_unexecuted_blocks=1 00:06:51.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.023 ' 00:06:51.023 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:51.023 11:01:15 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:51.023 11:01:15 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:51.023 11:01:15 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:51.024 11:01:15 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:51.024 11:01:15 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.024 11:01:15 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.024 11:01:15 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.024 11:01:15 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:51.024 11:01:15 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:51.024 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:51.024 11:01:15 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:51.024 INFO: launching applications... 00:06:51.024 11:01:15 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=138033 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:51.024 Waiting for target to run... 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 138033 /var/tmp/spdk_tgt.sock 00:06:51.024 11:01:15 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 138033 ']' 00:06:51.024 11:01:15 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:51.024 11:01:15 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:51.024 11:01:15 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:51.024 11:01:15 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:51.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
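The launch above starts the target on a private RPC socket (-m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock) with the extra_key.json config, and the harness's waitforlisten then blocks until the target is reachable. A hedged sketch of that launch-and-wait step; the file-existence poll below is a simplified stand-in for what waitforlisten actually checks over RPC:

  SPDK_TGT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  SOCK=/var/tmp/spdk_tgt.sock
  CFG=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json
  "$SPDK_TGT" -m 0x1 -s 1024 -r "$SOCK" --json "$CFG" &
  tgt_pid=$!
  for _ in $(seq 1 100); do          # ~10s budget at 0.1s per probe
      [ -S "$SOCK" ] && break        # stop once the UNIX-domain socket exists
      sleep 0.1
  done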
00:06:51.024 11:01:15 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:51.024 11:01:15 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:51.024 [2024-11-17 11:01:15.660141] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:51.024 [2024-11-17 11:01:15.660226] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138033 ] 00:06:51.594 [2024-11-17 11:01:15.964585] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.594 [2024-11-17 11:01:15.977191] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.164 11:01:16 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:52.164 11:01:16 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:52.164 00:06:52.164 11:01:16 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:52.164 INFO: shutting down applications... 00:06:52.164 11:01:16 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 138033 ]] 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 138033 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 138033 00:06:52.164 11:01:16 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 138033 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:52.424 11:01:17 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:52.424 SPDK target shutdown done 00:06:52.424 11:01:17 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:52.424 Success 00:06:52.424 00:06:52.424 real 0m1.599s 00:06:52.424 user 0m1.315s 00:06:52.424 sys 0m0.451s 00:06:52.424 11:01:17 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:52.424 11:01:17 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:52.424 ************************************ 00:06:52.424 END TEST json_config_extra_key 00:06:52.424 ************************************ 00:06:52.424 11:01:17 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
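The shutdown traced above (json_config/common.sh) sends SIGINT and then polls the pid for up to thirty half-second intervals before printing 'SPDK target shutdown done'. The same loop, condensed into a hedged helper:

  # Sketch of the graceful-shutdown wait: SIGINT, then poll with kill -0.
  shutdown_tgt() {
      local pid=$1 i
      kill -SIGINT "$pid"
      for (( i = 0; i < 30; i++ )); do
          kill -0 "$pid" 2>/dev/null || break   # kill -0 fails once the pid is gone
          sleep 0.5
      done
      echo 'SPDK target shutdown done'
  }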
00:06:52.424 11:01:17 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:52.424 11:01:17 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:52.424 11:01:17 -- common/autotest_common.sh@10 -- # set +x 00:06:52.684 ************************************ 00:06:52.684 START TEST alias_rpc 00:06:52.684 ************************************ 00:06:52.684 11:01:17 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:52.684 * Looking for test storage... 00:06:52.684 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:52.684 11:01:17 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:52.684 11:01:17 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:52.684 11:01:17 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:52.684 11:01:17 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:52.684 11:01:17 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:52.685 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.685 --rc genhtml_branch_coverage=1 00:06:52.685 --rc genhtml_function_coverage=1 00:06:52.685 --rc genhtml_legend=1 00:06:52.685 --rc geninfo_all_blocks=1 00:06:52.685 --rc geninfo_unexecuted_blocks=1 00:06:52.685 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.685 ' 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:52.685 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.685 --rc genhtml_branch_coverage=1 00:06:52.685 --rc genhtml_function_coverage=1 00:06:52.685 --rc genhtml_legend=1 00:06:52.685 --rc geninfo_all_blocks=1 00:06:52.685 --rc geninfo_unexecuted_blocks=1 00:06:52.685 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.685 ' 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:52.685 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.685 --rc genhtml_branch_coverage=1 00:06:52.685 --rc genhtml_function_coverage=1 00:06:52.685 --rc genhtml_legend=1 00:06:52.685 --rc geninfo_all_blocks=1 00:06:52.685 --rc geninfo_unexecuted_blocks=1 00:06:52.685 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.685 ' 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:52.685 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.685 --rc genhtml_branch_coverage=1 00:06:52.685 --rc genhtml_function_coverage=1 00:06:52.685 --rc genhtml_legend=1 00:06:52.685 --rc geninfo_all_blocks=1 00:06:52.685 --rc geninfo_unexecuted_blocks=1 00:06:52.685 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.685 ' 00:06:52.685 11:01:17 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:52.685 11:01:17 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=138356 00:06:52.685 11:01:17 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 138356 00:06:52.685 11:01:17 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.685 11:01:17 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 138356 ']' 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.685 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:52.685 11:01:17 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.685 [2024-11-17 11:01:17.327090] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:52.685 [2024-11-17 11:01:17.327157] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138356 ] 00:06:52.944 [2024-11-17 11:01:17.412119] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.944 [2024-11-17 11:01:17.433519] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.204 11:01:17 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:53.204 11:01:17 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:53.204 11:01:17 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:53.463 11:01:17 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 138356 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 138356 ']' 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 138356 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138356 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138356' 00:06:53.464 killing process with pid 138356 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@973 -- # kill 138356 00:06:53.464 11:01:17 alias_rpc -- common/autotest_common.sh@978 -- # wait 138356 00:06:53.723 00:06:53.723 real 0m1.113s 00:06:53.723 user 0m1.103s 00:06:53.723 sys 0m0.476s 00:06:53.723 11:01:18 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.723 11:01:18 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.723 ************************************ 00:06:53.723 END TEST alias_rpc 00:06:53.723 ************************************ 00:06:53.723 11:01:18 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:53.723 11:01:18 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:53.723 11:01:18 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.723 11:01:18 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.723 11:01:18 -- common/autotest_common.sh@10 -- # set +x 00:06:53.723 ************************************ 00:06:53.723 START TEST spdkcli_tcp 
00:06:53.723 ************************************ 00:06:53.723 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:53.984 * Looking for test storage... 00:06:53.984 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:53.984 11:01:18 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:53.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.984 --rc genhtml_branch_coverage=1 00:06:53.984 --rc genhtml_function_coverage=1 00:06:53.984 --rc genhtml_legend=1 00:06:53.984 --rc geninfo_all_blocks=1 00:06:53.984 --rc geninfo_unexecuted_blocks=1 00:06:53.984 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.984 ' 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:53.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.984 --rc genhtml_branch_coverage=1 00:06:53.984 --rc genhtml_function_coverage=1 00:06:53.984 --rc genhtml_legend=1 00:06:53.984 --rc geninfo_all_blocks=1 00:06:53.984 --rc geninfo_unexecuted_blocks=1 00:06:53.984 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.984 ' 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:53.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.984 --rc genhtml_branch_coverage=1 00:06:53.984 --rc genhtml_function_coverage=1 00:06:53.984 --rc genhtml_legend=1 00:06:53.984 --rc geninfo_all_blocks=1 00:06:53.984 --rc geninfo_unexecuted_blocks=1 00:06:53.984 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.984 ' 00:06:53.984 11:01:18 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:53.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.984 --rc genhtml_branch_coverage=1 00:06:53.984 --rc genhtml_function_coverage=1 00:06:53.984 --rc genhtml_legend=1 00:06:53.984 --rc geninfo_all_blocks=1 00:06:53.984 --rc geninfo_unexecuted_blocks=1 00:06:53.984 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.984 ' 00:06:53.984 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:53.984 11:01:18 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:53.984 11:01:18 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:53.984 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:53.985 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:53.985 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:53.985 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:53.985 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=138685 00:06:53.985 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:53.985 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 138685 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 138685 ']' 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.985 11:01:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:53.985 [2024-11-17 11:01:18.528966] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:53.985 [2024-11-17 11:01:18.529064] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138685 ] 00:06:53.985 [2024-11-17 11:01:18.615269] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:53.985 [2024-11-17 11:01:18.638425] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.985 [2024-11-17 11:01:18.638426] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:54.243 11:01:18 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:54.243 11:01:18 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:06:54.243 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=138689 00:06:54.243 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:54.243 11:01:18 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:54.504 [ 00:06:54.504 "spdk_get_version", 00:06:54.504 "rpc_get_methods", 00:06:54.504 "notify_get_notifications", 00:06:54.504 "notify_get_types", 00:06:54.504 "trace_get_info", 00:06:54.504 "trace_get_tpoint_group_mask", 00:06:54.504 "trace_disable_tpoint_group", 00:06:54.504 "trace_enable_tpoint_group", 00:06:54.504 "trace_clear_tpoint_mask", 00:06:54.504 "trace_set_tpoint_mask", 00:06:54.504 "fsdev_set_opts", 00:06:54.504 "fsdev_get_opts", 00:06:54.504 "framework_get_pci_devices", 00:06:54.504 "framework_get_config", 00:06:54.504 "framework_get_subsystems", 00:06:54.504 "vfu_tgt_set_base_path", 00:06:54.504 "keyring_get_keys", 
00:06:54.504 "iobuf_get_stats", 00:06:54.504 "iobuf_set_options", 00:06:54.504 "sock_get_default_impl", 00:06:54.504 "sock_set_default_impl", 00:06:54.504 "sock_impl_set_options", 00:06:54.504 "sock_impl_get_options", 00:06:54.504 "vmd_rescan", 00:06:54.504 "vmd_remove_device", 00:06:54.504 "vmd_enable", 00:06:54.504 "accel_get_stats", 00:06:54.504 "accel_set_options", 00:06:54.504 "accel_set_driver", 00:06:54.504 "accel_crypto_key_destroy", 00:06:54.504 "accel_crypto_keys_get", 00:06:54.504 "accel_crypto_key_create", 00:06:54.504 "accel_assign_opc", 00:06:54.504 "accel_get_module_info", 00:06:54.504 "accel_get_opc_assignments", 00:06:54.504 "bdev_get_histogram", 00:06:54.504 "bdev_enable_histogram", 00:06:54.504 "bdev_set_qos_limit", 00:06:54.504 "bdev_set_qd_sampling_period", 00:06:54.504 "bdev_get_bdevs", 00:06:54.504 "bdev_reset_iostat", 00:06:54.504 "bdev_get_iostat", 00:06:54.504 "bdev_examine", 00:06:54.504 "bdev_wait_for_examine", 00:06:54.504 "bdev_set_options", 00:06:54.504 "scsi_get_devices", 00:06:54.504 "thread_set_cpumask", 00:06:54.504 "scheduler_set_options", 00:06:54.504 "framework_get_governor", 00:06:54.504 "framework_get_scheduler", 00:06:54.504 "framework_set_scheduler", 00:06:54.504 "framework_get_reactors", 00:06:54.504 "thread_get_io_channels", 00:06:54.504 "thread_get_pollers", 00:06:54.504 "thread_get_stats", 00:06:54.504 "framework_monitor_context_switch", 00:06:54.504 "spdk_kill_instance", 00:06:54.504 "log_enable_timestamps", 00:06:54.504 "log_get_flags", 00:06:54.504 "log_clear_flag", 00:06:54.504 "log_set_flag", 00:06:54.504 "log_get_level", 00:06:54.504 "log_set_level", 00:06:54.504 "log_get_print_level", 00:06:54.504 "log_set_print_level", 00:06:54.504 "framework_enable_cpumask_locks", 00:06:54.504 "framework_disable_cpumask_locks", 00:06:54.504 "framework_wait_init", 00:06:54.504 "framework_start_init", 00:06:54.504 "virtio_blk_create_transport", 00:06:54.504 "virtio_blk_get_transports", 00:06:54.504 "vhost_controller_set_coalescing", 00:06:54.504 "vhost_get_controllers", 00:06:54.504 "vhost_delete_controller", 00:06:54.504 "vhost_create_blk_controller", 00:06:54.504 "vhost_scsi_controller_remove_target", 00:06:54.504 "vhost_scsi_controller_add_target", 00:06:54.504 "vhost_start_scsi_controller", 00:06:54.504 "vhost_create_scsi_controller", 00:06:54.504 "ublk_recover_disk", 00:06:54.504 "ublk_get_disks", 00:06:54.504 "ublk_stop_disk", 00:06:54.504 "ublk_start_disk", 00:06:54.504 "ublk_destroy_target", 00:06:54.504 "ublk_create_target", 00:06:54.504 "nbd_get_disks", 00:06:54.504 "nbd_stop_disk", 00:06:54.504 "nbd_start_disk", 00:06:54.504 "env_dpdk_get_mem_stats", 00:06:54.504 "nvmf_stop_mdns_prr", 00:06:54.504 "nvmf_publish_mdns_prr", 00:06:54.504 "nvmf_subsystem_get_listeners", 00:06:54.504 "nvmf_subsystem_get_qpairs", 00:06:54.504 "nvmf_subsystem_get_controllers", 00:06:54.504 "nvmf_get_stats", 00:06:54.504 "nvmf_get_transports", 00:06:54.504 "nvmf_create_transport", 00:06:54.504 "nvmf_get_targets", 00:06:54.504 "nvmf_delete_target", 00:06:54.504 "nvmf_create_target", 00:06:54.504 "nvmf_subsystem_allow_any_host", 00:06:54.504 "nvmf_subsystem_set_keys", 00:06:54.504 "nvmf_subsystem_remove_host", 00:06:54.504 "nvmf_subsystem_add_host", 00:06:54.504 "nvmf_ns_remove_host", 00:06:54.504 "nvmf_ns_add_host", 00:06:54.504 "nvmf_subsystem_remove_ns", 00:06:54.504 "nvmf_subsystem_set_ns_ana_group", 00:06:54.504 "nvmf_subsystem_add_ns", 00:06:54.504 "nvmf_subsystem_listener_set_ana_state", 00:06:54.504 "nvmf_discovery_get_referrals", 00:06:54.504 
"nvmf_discovery_remove_referral", 00:06:54.504 "nvmf_discovery_add_referral", 00:06:54.504 "nvmf_subsystem_remove_listener", 00:06:54.504 "nvmf_subsystem_add_listener", 00:06:54.504 "nvmf_delete_subsystem", 00:06:54.504 "nvmf_create_subsystem", 00:06:54.504 "nvmf_get_subsystems", 00:06:54.504 "nvmf_set_crdt", 00:06:54.504 "nvmf_set_config", 00:06:54.504 "nvmf_set_max_subsystems", 00:06:54.504 "iscsi_get_histogram", 00:06:54.504 "iscsi_enable_histogram", 00:06:54.504 "iscsi_set_options", 00:06:54.504 "iscsi_get_auth_groups", 00:06:54.504 "iscsi_auth_group_remove_secret", 00:06:54.504 "iscsi_auth_group_add_secret", 00:06:54.504 "iscsi_delete_auth_group", 00:06:54.504 "iscsi_create_auth_group", 00:06:54.504 "iscsi_set_discovery_auth", 00:06:54.504 "iscsi_get_options", 00:06:54.504 "iscsi_target_node_request_logout", 00:06:54.504 "iscsi_target_node_set_redirect", 00:06:54.504 "iscsi_target_node_set_auth", 00:06:54.504 "iscsi_target_node_add_lun", 00:06:54.504 "iscsi_get_stats", 00:06:54.504 "iscsi_get_connections", 00:06:54.504 "iscsi_portal_group_set_auth", 00:06:54.504 "iscsi_start_portal_group", 00:06:54.504 "iscsi_delete_portal_group", 00:06:54.504 "iscsi_create_portal_group", 00:06:54.504 "iscsi_get_portal_groups", 00:06:54.504 "iscsi_delete_target_node", 00:06:54.504 "iscsi_target_node_remove_pg_ig_maps", 00:06:54.504 "iscsi_target_node_add_pg_ig_maps", 00:06:54.504 "iscsi_create_target_node", 00:06:54.504 "iscsi_get_target_nodes", 00:06:54.504 "iscsi_delete_initiator_group", 00:06:54.504 "iscsi_initiator_group_remove_initiators", 00:06:54.505 "iscsi_initiator_group_add_initiators", 00:06:54.505 "iscsi_create_initiator_group", 00:06:54.505 "iscsi_get_initiator_groups", 00:06:54.505 "fsdev_aio_delete", 00:06:54.505 "fsdev_aio_create", 00:06:54.505 "keyring_linux_set_options", 00:06:54.505 "keyring_file_remove_key", 00:06:54.505 "keyring_file_add_key", 00:06:54.505 "vfu_virtio_create_fs_endpoint", 00:06:54.505 "vfu_virtio_create_scsi_endpoint", 00:06:54.505 "vfu_virtio_scsi_remove_target", 00:06:54.505 "vfu_virtio_scsi_add_target", 00:06:54.505 "vfu_virtio_create_blk_endpoint", 00:06:54.505 "vfu_virtio_delete_endpoint", 00:06:54.505 "iaa_scan_accel_module", 00:06:54.505 "dsa_scan_accel_module", 00:06:54.505 "ioat_scan_accel_module", 00:06:54.505 "accel_error_inject_error", 00:06:54.505 "bdev_iscsi_delete", 00:06:54.505 "bdev_iscsi_create", 00:06:54.505 "bdev_iscsi_set_options", 00:06:54.505 "bdev_virtio_attach_controller", 00:06:54.505 "bdev_virtio_scsi_get_devices", 00:06:54.505 "bdev_virtio_detach_controller", 00:06:54.505 "bdev_virtio_blk_set_hotplug", 00:06:54.505 "bdev_ftl_set_property", 00:06:54.505 "bdev_ftl_get_properties", 00:06:54.505 "bdev_ftl_get_stats", 00:06:54.505 "bdev_ftl_unmap", 00:06:54.505 "bdev_ftl_unload", 00:06:54.505 "bdev_ftl_delete", 00:06:54.505 "bdev_ftl_load", 00:06:54.505 "bdev_ftl_create", 00:06:54.505 "bdev_aio_delete", 00:06:54.505 "bdev_aio_rescan", 00:06:54.505 "bdev_aio_create", 00:06:54.505 "blobfs_create", 00:06:54.505 "blobfs_detect", 00:06:54.505 "blobfs_set_cache_size", 00:06:54.505 "bdev_zone_block_delete", 00:06:54.505 "bdev_zone_block_create", 00:06:54.505 "bdev_delay_delete", 00:06:54.505 "bdev_delay_create", 00:06:54.505 "bdev_delay_update_latency", 00:06:54.505 "bdev_split_delete", 00:06:54.505 "bdev_split_create", 00:06:54.505 "bdev_error_inject_error", 00:06:54.505 "bdev_error_delete", 00:06:54.505 "bdev_error_create", 00:06:54.505 "bdev_raid_set_options", 00:06:54.505 "bdev_raid_remove_base_bdev", 00:06:54.505 "bdev_raid_add_base_bdev", 
00:06:54.505 "bdev_raid_delete", 00:06:54.505 "bdev_raid_create", 00:06:54.505 "bdev_raid_get_bdevs", 00:06:54.505 "bdev_lvol_set_parent_bdev", 00:06:54.505 "bdev_lvol_set_parent", 00:06:54.505 "bdev_lvol_check_shallow_copy", 00:06:54.505 "bdev_lvol_start_shallow_copy", 00:06:54.505 "bdev_lvol_grow_lvstore", 00:06:54.505 "bdev_lvol_get_lvols", 00:06:54.505 "bdev_lvol_get_lvstores", 00:06:54.505 "bdev_lvol_delete", 00:06:54.505 "bdev_lvol_set_read_only", 00:06:54.505 "bdev_lvol_resize", 00:06:54.505 "bdev_lvol_decouple_parent", 00:06:54.505 "bdev_lvol_inflate", 00:06:54.505 "bdev_lvol_rename", 00:06:54.505 "bdev_lvol_clone_bdev", 00:06:54.505 "bdev_lvol_clone", 00:06:54.505 "bdev_lvol_snapshot", 00:06:54.505 "bdev_lvol_create", 00:06:54.505 "bdev_lvol_delete_lvstore", 00:06:54.505 "bdev_lvol_rename_lvstore", 00:06:54.505 "bdev_lvol_create_lvstore", 00:06:54.505 "bdev_passthru_delete", 00:06:54.505 "bdev_passthru_create", 00:06:54.505 "bdev_nvme_cuse_unregister", 00:06:54.505 "bdev_nvme_cuse_register", 00:06:54.505 "bdev_opal_new_user", 00:06:54.505 "bdev_opal_set_lock_state", 00:06:54.505 "bdev_opal_delete", 00:06:54.505 "bdev_opal_get_info", 00:06:54.505 "bdev_opal_create", 00:06:54.505 "bdev_nvme_opal_revert", 00:06:54.505 "bdev_nvme_opal_init", 00:06:54.505 "bdev_nvme_send_cmd", 00:06:54.505 "bdev_nvme_set_keys", 00:06:54.505 "bdev_nvme_get_path_iostat", 00:06:54.505 "bdev_nvme_get_mdns_discovery_info", 00:06:54.505 "bdev_nvme_stop_mdns_discovery", 00:06:54.505 "bdev_nvme_start_mdns_discovery", 00:06:54.505 "bdev_nvme_set_multipath_policy", 00:06:54.505 "bdev_nvme_set_preferred_path", 00:06:54.505 "bdev_nvme_get_io_paths", 00:06:54.505 "bdev_nvme_remove_error_injection", 00:06:54.505 "bdev_nvme_add_error_injection", 00:06:54.505 "bdev_nvme_get_discovery_info", 00:06:54.505 "bdev_nvme_stop_discovery", 00:06:54.505 "bdev_nvme_start_discovery", 00:06:54.505 "bdev_nvme_get_controller_health_info", 00:06:54.505 "bdev_nvme_disable_controller", 00:06:54.505 "bdev_nvme_enable_controller", 00:06:54.505 "bdev_nvme_reset_controller", 00:06:54.505 "bdev_nvme_get_transport_statistics", 00:06:54.505 "bdev_nvme_apply_firmware", 00:06:54.505 "bdev_nvme_detach_controller", 00:06:54.505 "bdev_nvme_get_controllers", 00:06:54.505 "bdev_nvme_attach_controller", 00:06:54.505 "bdev_nvme_set_hotplug", 00:06:54.505 "bdev_nvme_set_options", 00:06:54.505 "bdev_null_resize", 00:06:54.505 "bdev_null_delete", 00:06:54.505 "bdev_null_create", 00:06:54.505 "bdev_malloc_delete", 00:06:54.505 "bdev_malloc_create" 00:06:54.505 ] 00:06:54.505 11:01:19 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:54.505 11:01:19 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:54.505 11:01:19 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 138685 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 138685 ']' 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 138685 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138685 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:54.505 11:01:19 spdkcli_tcp -- 
common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138685' 00:06:54.505 killing process with pid 138685 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 138685 00:06:54.505 11:01:19 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 138685 00:06:55.076 00:06:55.076 real 0m1.127s 00:06:55.076 user 0m1.853s 00:06:55.076 sys 0m0.534s 00:06:55.076 11:01:19 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.076 11:01:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:55.076 ************************************ 00:06:55.076 END TEST spdkcli_tcp 00:06:55.076 ************************************ 00:06:55.076 11:01:19 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:55.076 11:01:19 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:55.076 11:01:19 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:55.076 11:01:19 -- common/autotest_common.sh@10 -- # set +x 00:06:55.076 ************************************ 00:06:55.076 START TEST dpdk_mem_utility 00:06:55.076 ************************************ 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:55.076 * Looking for test storage... 00:06:55.076 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:55.076 11:01:19 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:55.076 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:55.076 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.076 --rc genhtml_branch_coverage=1 00:06:55.076 --rc genhtml_function_coverage=1 00:06:55.077 --rc genhtml_legend=1 00:06:55.077 --rc geninfo_all_blocks=1 00:06:55.077 --rc geninfo_unexecuted_blocks=1 00:06:55.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.077 ' 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:55.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.077 --rc genhtml_branch_coverage=1 00:06:55.077 --rc genhtml_function_coverage=1 00:06:55.077 --rc genhtml_legend=1 00:06:55.077 --rc geninfo_all_blocks=1 00:06:55.077 --rc geninfo_unexecuted_blocks=1 00:06:55.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.077 ' 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:55.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.077 --rc genhtml_branch_coverage=1 00:06:55.077 --rc genhtml_function_coverage=1 00:06:55.077 --rc genhtml_legend=1 00:06:55.077 --rc geninfo_all_blocks=1 00:06:55.077 --rc geninfo_unexecuted_blocks=1 00:06:55.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.077 ' 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:55.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.077 --rc genhtml_branch_coverage=1 00:06:55.077 --rc genhtml_function_coverage=1 00:06:55.077 --rc genhtml_legend=1 00:06:55.077 --rc geninfo_all_blocks=1 00:06:55.077 --rc geninfo_unexecuted_blocks=1 00:06:55.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.077 ' 00:06:55.077 11:01:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:55.077 11:01:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=139021 00:06:55.077 11:01:19 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 139021 00:06:55.077 11:01:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 139021 ']' 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:55.077 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:55.077 11:01:19 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:55.077 [2024-11-17 11:01:19.728612] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:55.077 [2024-11-17 11:01:19.728673] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139021 ] 00:06:55.337 [2024-11-17 11:01:19.811831] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.337 [2024-11-17 11:01:19.834087] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.598 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:55.598 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:06:55.598 11:01:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:55.598 11:01:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:55.598 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.598 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:55.598 { 00:06:55.598 "filename": "/tmp/spdk_mem_dump.txt" 00:06:55.598 } 00:06:55.598 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.598 11:01:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:55.598 DPDK memory size 810.000000 MiB in 1 heap(s) 00:06:55.598 1 heaps totaling size 810.000000 MiB 00:06:55.598 size: 810.000000 MiB heap id: 0 00:06:55.598 end heaps---------- 00:06:55.598 9 mempools totaling size 595.772034 MiB 00:06:55.598 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:55.598 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:55.598 size: 92.545471 MiB name: bdev_io_139021 00:06:55.598 size: 50.003479 MiB name: msgpool_139021 00:06:55.598 size: 36.509338 MiB name: fsdev_io_139021 00:06:55.598 size: 21.763794 MiB name: PDU_Pool 00:06:55.598 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:55.598 size: 4.133484 MiB name: evtpool_139021 00:06:55.598 size: 0.026123 MiB name: Session_Pool 00:06:55.598 end mempools------- 00:06:55.598 6 memzones totaling size 4.142822 MiB 00:06:55.598 size: 1.000366 MiB name: RG_ring_0_139021 00:06:55.598 size: 1.000366 MiB name: RG_ring_1_139021 00:06:55.598 size: 1.000366 MiB name: RG_ring_4_139021 
00:06:55.598 size: 1.000366 MiB name: RG_ring_5_139021 00:06:55.598 size: 0.125366 MiB name: RG_ring_2_139021 00:06:55.598 size: 0.015991 MiB name: RG_ring_3_139021 00:06:55.598 end memzones------- 00:06:55.598 11:01:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:55.598 heap id: 0 total size: 810.000000 MiB number of busy elements: 44 number of free elements: 15 00:06:55.598 list of free elements. size: 10.862488 MiB 00:06:55.598 element at address: 0x200018a00000 with size: 0.999878 MiB 00:06:55.598 element at address: 0x200018c00000 with size: 0.999878 MiB 00:06:55.598 element at address: 0x200000400000 with size: 0.998535 MiB 00:06:55.598 element at address: 0x200031800000 with size: 0.994446 MiB 00:06:55.598 element at address: 0x200008000000 with size: 0.959839 MiB 00:06:55.598 element at address: 0x200012c00000 with size: 0.954285 MiB 00:06:55.598 element at address: 0x200018e00000 with size: 0.936584 MiB 00:06:55.598 element at address: 0x200000200000 with size: 0.717346 MiB 00:06:55.598 element at address: 0x20001a600000 with size: 0.582886 MiB 00:06:55.598 element at address: 0x200000c00000 with size: 0.495422 MiB 00:06:55.598 element at address: 0x200003e00000 with size: 0.490723 MiB 00:06:55.598 element at address: 0x200019000000 with size: 0.485657 MiB 00:06:55.598 element at address: 0x200010600000 with size: 0.481934 MiB 00:06:55.598 element at address: 0x200027a00000 with size: 0.410034 MiB 00:06:55.598 element at address: 0x200000800000 with size: 0.355042 MiB 00:06:55.598 list of standard malloc elements. size: 199.218628 MiB 00:06:55.598 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:06:55.598 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:06:55.598 element at address: 0x200018afff80 with size: 1.000122 MiB 00:06:55.598 element at address: 0x200018cfff80 with size: 1.000122 MiB 00:06:55.598 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:55.598 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:55.598 element at address: 0x200018eeff00 with size: 0.062622 MiB 00:06:55.598 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:55.598 element at address: 0x200018eefdc0 with size: 0.000305 MiB 00:06:55.598 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:55.598 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:55.598 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:06:55.598 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:06:55.598 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:06:55.598 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20000085b040 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20000085b100 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000008df880 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200000cff000 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200000cff0c0 with size: 0.000183 
MiB 00:06:55.599 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20001067b600 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200012cf44c0 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200018eefc40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200018eefd00 with size: 0.000183 MiB 00:06:55.599 element at address: 0x2000190bc740 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20001a695380 with size: 0.000183 MiB 00:06:55.599 element at address: 0x20001a695440 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200027a68f80 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200027a69040 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200027a6fc40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200027a6fe40 with size: 0.000183 MiB 00:06:55.599 element at address: 0x200027a6ff00 with size: 0.000183 MiB 00:06:55.599 list of memzone associated elements. size: 599.918884 MiB 00:06:55.599 element at address: 0x20001a695500 with size: 211.416748 MiB 00:06:55.599 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:55.599 element at address: 0x200027a6ffc0 with size: 157.562561 MiB 00:06:55.599 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:55.599 element at address: 0x200012df4780 with size: 92.045044 MiB 00:06:55.599 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_139021_0 00:06:55.599 element at address: 0x200000dff380 with size: 48.003052 MiB 00:06:55.599 associated memzone info: size: 48.002930 MiB name: MP_msgpool_139021_0 00:06:55.599 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:06:55.599 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_139021_0 00:06:55.599 element at address: 0x2000191be940 with size: 20.255554 MiB 00:06:55.599 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:55.599 element at address: 0x2000319feb40 with size: 18.005066 MiB 00:06:55.599 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:55.599 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:06:55.599 associated memzone info: size: 3.000122 MiB name: MP_evtpool_139021_0 00:06:55.599 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:06:55.599 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_139021 00:06:55.599 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:55.599 associated memzone info: size: 1.007996 MiB name: MP_evtpool_139021 00:06:55.599 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:06:55.599 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:55.599 element at address: 0x2000190bc800 with size: 1.008118 MiB 00:06:55.599 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:55.599 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:06:55.599 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:55.599 element at address: 0x200003efde40 with size: 1.008118 MiB 00:06:55.599 associated memzone info: size: 1.007996 MiB name: 
MP_SCSI_TASK_Pool 00:06:55.599 element at address: 0x200000cff180 with size: 1.000488 MiB 00:06:55.599 associated memzone info: size: 1.000366 MiB name: RG_ring_0_139021 00:06:55.599 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:06:55.599 associated memzone info: size: 1.000366 MiB name: RG_ring_1_139021 00:06:55.599 element at address: 0x200012cf4580 with size: 1.000488 MiB 00:06:55.599 associated memzone info: size: 1.000366 MiB name: RG_ring_4_139021 00:06:55.599 element at address: 0x2000318fe940 with size: 1.000488 MiB 00:06:55.599 associated memzone info: size: 1.000366 MiB name: RG_ring_5_139021 00:06:55.599 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:06:55.599 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_139021 00:06:55.599 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:06:55.599 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_139021 00:06:55.599 element at address: 0x20001067b780 with size: 0.500488 MiB 00:06:55.599 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:55.599 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:06:55.599 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:55.599 element at address: 0x20001907c540 with size: 0.250488 MiB 00:06:55.599 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:55.599 element at address: 0x2000002b7a40 with size: 0.125488 MiB 00:06:55.599 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_139021 00:06:55.599 element at address: 0x2000008df940 with size: 0.125488 MiB 00:06:55.599 associated memzone info: size: 0.125366 MiB name: RG_ring_2_139021 00:06:55.599 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:06:55.599 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:55.599 element at address: 0x200027a69100 with size: 0.023743 MiB 00:06:55.599 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:55.599 element at address: 0x2000008db680 with size: 0.016113 MiB 00:06:55.599 associated memzone info: size: 0.015991 MiB name: RG_ring_3_139021 00:06:55.599 element at address: 0x200027a6f240 with size: 0.002441 MiB 00:06:55.599 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:55.599 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:06:55.599 associated memzone info: size: 0.000183 MiB name: MP_msgpool_139021 00:06:55.599 element at address: 0x2000008db480 with size: 0.000305 MiB 00:06:55.599 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_139021 00:06:55.599 element at address: 0x20000085af00 with size: 0.000305 MiB 00:06:55.599 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_139021 00:06:55.599 element at address: 0x200027a6fd00 with size: 0.000305 MiB 00:06:55.599 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:55.599 11:01:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:55.599 11:01:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 139021 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 139021 ']' 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 139021 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
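The heap, mempool, and memzone tables above are printed by scripts/dpdk_mem_info.py from the /tmp/spdk_mem_dump.txt file that the env_dpdk_get_mem_stats RPC writes; the -m 0 invocation adds the per-element breakdown for heap id 0. A minimal manual replay, assuming a default SPDK checkout as the working directory (the relative paths and the sleep are illustrative; the test itself polls /var/tmp/spdk.sock via waitforlisten):

# Sketch of test_dpdk_mem_info.sh's core sequence (paths assumed).
./build/bin/spdk_tgt &                    # start an SPDK target
tgt=$!
sleep 2                                   # stand-in for the test's waitforlisten
./scripts/rpc.py env_dpdk_get_mem_stats   # writes /tmp/spdk_mem_dump.txt
./scripts/dpdk_mem_info.py                # heap/mempool/memzone summary
./scripts/dpdk_mem_info.py -m 0           # per-element detail for heap id 0
kill "$tgt"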
00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 139021 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 139021' 00:06:55.599 killing process with pid 139021 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 139021 00:06:55.599 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 139021 00:06:55.860 00:06:55.860 real 0m0.987s 00:06:55.860 user 0m0.907s 00:06:55.860 sys 0m0.438s 00:06:55.860 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.860 11:01:20 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:55.860 ************************************ 00:06:55.860 END TEST dpdk_mem_utility 00:06:55.860 ************************************ 00:06:56.120 11:01:20 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:56.120 11:01:20 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:56.120 11:01:20 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:56.120 11:01:20 -- common/autotest_common.sh@10 -- # set +x 00:06:56.120 ************************************ 00:06:56.120 START TEST event 00:06:56.120 ************************************ 00:06:56.120 11:01:20 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:56.120 * Looking for test storage... 00:06:56.120 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:56.120 11:01:20 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:56.120 11:01:20 event -- common/autotest_common.sh@1693 -- # lcov --version 00:06:56.120 11:01:20 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:56.120 11:01:20 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:56.120 11:01:20 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:56.120 11:01:20 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:56.120 11:01:20 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:56.120 11:01:20 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:56.120 11:01:20 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:56.120 11:01:20 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:56.120 11:01:20 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:56.120 11:01:20 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:56.120 11:01:20 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:56.120 11:01:20 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:56.120 11:01:20 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:56.121 11:01:20 event -- scripts/common.sh@344 -- # case "$op" in 00:06:56.121 11:01:20 event -- scripts/common.sh@345 -- # : 1 00:06:56.121 11:01:20 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:56.121 11:01:20 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:56.121 11:01:20 event -- scripts/common.sh@365 -- # decimal 1 00:06:56.121 11:01:20 event -- scripts/common.sh@353 -- # local d=1 00:06:56.121 11:01:20 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:56.121 11:01:20 event -- scripts/common.sh@355 -- # echo 1 00:06:56.121 11:01:20 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:56.121 11:01:20 event -- scripts/common.sh@366 -- # decimal 2 00:06:56.121 11:01:20 event -- scripts/common.sh@353 -- # local d=2 00:06:56.121 11:01:20 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:56.121 11:01:20 event -- scripts/common.sh@355 -- # echo 2 00:06:56.121 11:01:20 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:56.121 11:01:20 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:56.121 11:01:20 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:56.121 11:01:20 event -- scripts/common.sh@368 -- # return 0 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:56.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.381 --rc genhtml_branch_coverage=1 00:06:56.381 --rc genhtml_function_coverage=1 00:06:56.381 --rc genhtml_legend=1 00:06:56.381 --rc geninfo_all_blocks=1 00:06:56.381 --rc geninfo_unexecuted_blocks=1 00:06:56.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.381 ' 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:56.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.381 --rc genhtml_branch_coverage=1 00:06:56.381 --rc genhtml_function_coverage=1 00:06:56.381 --rc genhtml_legend=1 00:06:56.381 --rc geninfo_all_blocks=1 00:06:56.381 --rc geninfo_unexecuted_blocks=1 00:06:56.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.381 ' 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:56.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.381 --rc genhtml_branch_coverage=1 00:06:56.381 --rc genhtml_function_coverage=1 00:06:56.381 --rc genhtml_legend=1 00:06:56.381 --rc geninfo_all_blocks=1 00:06:56.381 --rc geninfo_unexecuted_blocks=1 00:06:56.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.381 ' 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:56.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.381 --rc genhtml_branch_coverage=1 00:06:56.381 --rc genhtml_function_coverage=1 00:06:56.381 --rc genhtml_legend=1 00:06:56.381 --rc geninfo_all_blocks=1 00:06:56.381 --rc geninfo_unexecuted_blocks=1 00:06:56.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.381 ' 00:06:56.381 11:01:20 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:56.381 11:01:20 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:56.381 11:01:20 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:06:56.381 11:01:20 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
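The preamble just traced is the same scripts/common.sh boilerplate that opens every test in this log: lt 1.15 2 splits the installed lcov version and the threshold 2 on ., - and :, compares the fields one by one, and the return code decides which LCOV_OPTS (branch/function coverage plus the llvm-gcov wrapper) get exported. Stripped of the xtrace noise it reduces to roughly the following sketch, a simplification rather than the verbatim cmp_versions:

# Field-wise version compare, simplified from scripts/common.sh.
lt() {                                    # true (0) when $1 < $2
    local IFS=.-:
    local -a ver1 ver2
    read -ra ver1 <<< "$1"
    read -ra ver2 <<< "$2"
    local v len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < len; v++ )); do
        (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
        (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
    done
    return 1                              # equal is not less-than
}
lt 1.15 2 && echo 'lcov 1.15 < 2: coverage flags enabled'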
00:06:56.381 11:01:20 event -- common/autotest_common.sh@10 -- # set +x 00:06:56.381 ************************************ 00:06:56.381 START TEST event_perf 00:06:56.381 ************************************ 00:06:56.381 11:01:20 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:56.381 Running I/O for 1 seconds...[2024-11-17 11:01:20.839073] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:56.381 [2024-11-17 11:01:20.839199] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139194 ] 00:06:56.382 [2024-11-17 11:01:20.930310] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:56.382 [2024-11-17 11:01:20.956096] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.382 [2024-11-17 11:01:20.956143] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:56.382 [2024-11-17 11:01:20.956227] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.382 [2024-11-17 11:01:20.956228] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.763 Running I/O for 1 seconds... 00:06:57.763 lcore 0: 191881 00:06:57.763 lcore 1: 191880 00:06:57.763 lcore 2: 191880 00:06:57.763 lcore 3: 191879 00:06:57.763 done. 00:06:57.763 00:06:57.763 real 0m1.167s 00:06:57.763 user 0m4.066s 00:06:57.763 sys 0m0.097s 00:06:57.763 11:01:21 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:57.763 11:01:21 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:57.763 ************************************ 00:06:57.763 END TEST event_perf 00:06:57.763 ************************************ 00:06:57.763 11:01:22 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:57.763 11:01:22 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:06:57.763 11:01:22 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:57.763 11:01:22 event -- common/autotest_common.sh@10 -- # set +x 00:06:57.763 ************************************ 00:06:57.763 START TEST event_reactor 00:06:57.763 ************************************ 00:06:57.763 11:01:22 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:57.763 [2024-11-17 11:01:22.092567] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
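event_perf is launched above with -m 0xF -t 1: a four-core mask and a one-second run, matching the four reactors that just came up. Once they are running, the per-lcore lines below count the events each core processed during that window. The mask is plain EAL coremask arithmetic (the printf line is an illustration, not part of the test):

# Coremask used above: bits 0-3 set -> reactors on cores 0, 1, 2 and 3.
printf '0x%X\n' $(( (1 << 0) | (1 << 1) | (1 << 2) | (1 << 3) ))   # 0xF
./test/event/event_perf/event_perf -m 0xF -t 1                     # as invoked in this run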
00:06:57.763 [2024-11-17 11:01:22.092684] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139387 ] 00:06:57.763 [2024-11-17 11:01:22.181822] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.763 [2024-11-17 11:01:22.206003] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.704 test_start 00:06:58.704 oneshot 00:06:58.704 tick 100 00:06:58.704 tick 100 00:06:58.704 tick 250 00:06:58.704 tick 100 00:06:58.704 tick 100 00:06:58.704 tick 100 00:06:58.704 tick 250 00:06:58.704 tick 500 00:06:58.704 tick 100 00:06:58.704 tick 100 00:06:58.704 tick 250 00:06:58.704 tick 100 00:06:58.704 tick 100 00:06:58.704 test_end 00:06:58.704 00:06:58.704 real 0m1.162s 00:06:58.704 user 0m1.058s 00:06:58.704 sys 0m0.099s 00:06:58.704 11:01:23 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:58.704 11:01:23 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:58.704 ************************************ 00:06:58.704 END TEST event_reactor 00:06:58.704 ************************************ 00:06:58.704 11:01:23 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:58.704 11:01:23 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:06:58.704 11:01:23 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:58.705 11:01:23 event -- common/autotest_common.sh@10 -- # set +x 00:06:58.705 ************************************ 00:06:58.705 START TEST event_reactor_perf 00:06:58.705 ************************************ 00:06:58.705 11:01:23 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:58.705 [2024-11-17 11:01:23.341791] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
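Each lcore above handled roughly 191,880 events in the one-second window, about 767k in total; a quick check on this run's own numbers (values copied from the log):

# Per-lcore totals from the event_perf run above.
echo $(( 191881 + 191880 + 191880 + 191879 ))   # 767520 events across 4 cores

The event_reactor run now initializing drives a single reactor (-c 0x1 in the EAL parameters below); the oneshot/tick lines it prints appear to trace its timed events firing at 100, 250 and 500-unit periods.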
00:06:58.705 [2024-11-17 11:01:23.341874] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139667 ] 00:06:58.964 [2024-11-17 11:01:23.432816] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.964 [2024-11-17 11:01:23.455851] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.904 test_start 00:06:59.904 test_end 00:06:59.904 Performance: 958250 events per second 00:06:59.904 00:06:59.904 real 0m1.163s 00:06:59.904 user 0m1.063s 00:06:59.904 sys 0m0.095s 00:06:59.904 11:01:24 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:59.904 11:01:24 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:59.904 ************************************ 00:06:59.904 END TEST event_reactor_perf 00:06:59.904 ************************************ 00:06:59.904 11:01:24 event -- event/event.sh@49 -- # uname -s 00:06:59.904 11:01:24 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:59.904 11:01:24 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:59.904 11:01:24 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:59.904 11:01:24 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:59.904 11:01:24 event -- common/autotest_common.sh@10 -- # set +x 00:07:00.164 ************************************ 00:07:00.164 START TEST event_scheduler 00:07:00.164 ************************************ 00:07:00.164 11:01:24 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:00.164 * Looking for test storage... 
00:07:00.164 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:00.164 11:01:24 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:00.164 11:01:24 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:07:00.164 11:01:24 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:00.164 11:01:24 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:00.164 11:01:24 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:00.164 11:01:24 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:00.165 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.165 --rc genhtml_branch_coverage=1 00:07:00.165 --rc genhtml_function_coverage=1 00:07:00.165 --rc genhtml_legend=1 00:07:00.165 --rc geninfo_all_blocks=1 00:07:00.165 --rc geninfo_unexecuted_blocks=1 00:07:00.165 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.165 ' 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:00.165 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.165 --rc genhtml_branch_coverage=1 00:07:00.165 --rc genhtml_function_coverage=1 00:07:00.165 --rc genhtml_legend=1 00:07:00.165 --rc geninfo_all_blocks=1 00:07:00.165 --rc geninfo_unexecuted_blocks=1 00:07:00.165 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.165 ' 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:00.165 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.165 --rc genhtml_branch_coverage=1 00:07:00.165 --rc genhtml_function_coverage=1 00:07:00.165 --rc genhtml_legend=1 00:07:00.165 --rc geninfo_all_blocks=1 00:07:00.165 --rc geninfo_unexecuted_blocks=1 00:07:00.165 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.165 ' 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:00.165 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.165 --rc genhtml_branch_coverage=1 00:07:00.165 --rc genhtml_function_coverage=1 00:07:00.165 --rc genhtml_legend=1 00:07:00.165 --rc geninfo_all_blocks=1 00:07:00.165 --rc geninfo_unexecuted_blocks=1 00:07:00.165 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.165 ' 00:07:00.165 11:01:24 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:00.165 11:01:24 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=139989 00:07:00.165 11:01:24 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:00.165 11:01:24 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:00.165 11:01:24 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 139989 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 139989 ']' 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:00.165 11:01:24 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.165 [2024-11-17 11:01:24.807983] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:00.165 [2024-11-17 11:01:24.808074] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139989 ] 00:07:00.431 [2024-11-17 11:01:24.897433] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:00.431 [2024-11-17 11:01:24.923385] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.431 [2024-11-17 11:01:24.923498] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.431 [2024-11-17 11:01:24.923610] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:00.431 [2024-11-17 11:01:24.923611] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:00.431 11:01:24 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:00.431 11:01:24 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:07:00.431 11:01:24 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:00.431 11:01:24 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.431 11:01:24 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.431 [2024-11-17 11:01:24.980268] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:00.431 [2024-11-17 11:01:24.980288] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:00.431 [2024-11-17 11:01:24.980300] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:00.431 [2024-11-17 11:01:24.980307] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:00.431 [2024-11-17 11:01:24.980314] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:00.431 11:01:24 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.431 11:01:24 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:00.431 11:01:24 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.431 
11:01:24 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.431 [2024-11-17 11:01:25.047585] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:00.431 11:01:25 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.431 11:01:25 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:00.431 11:01:25 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:00.431 11:01:25 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:00.431 11:01:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 ************************************ 00:07:00.691 START TEST scheduler_create_thread 00:07:00.691 ************************************ 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 2 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 3 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 4 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 5 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 6 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 7 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 8 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 9 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.691 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.691 10 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.692 11:01:25 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.692 11:01:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:02.073 11:01:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:02.073 11:01:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:02.073 11:01:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:02.073 11:01:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:02.073 11:01:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.455 11:01:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:03.455 00:07:03.455 real 0m2.618s 00:07:03.455 user 0m0.020s 00:07:03.455 sys 0m0.012s 00:07:03.455 11:01:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:03.455 11:01:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.455 ************************************ 00:07:03.455 END TEST scheduler_create_thread 00:07:03.455 ************************************ 00:07:03.455 11:01:27 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:03.455 11:01:27 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 139989 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 139989 ']' 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 139989 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 139989 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 139989' 00:07:03.455 killing process with pid 139989 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 139989 00:07:03.455 11:01:27 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 139989 00:07:03.715 [2024-11-17 11:01:28.189647] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
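[Annotation] The scheduler_create_thread run above is driven entirely over JSON-RPC against the scheduler test app. A condensed sketch of the same call sequence, assuming rpc.py from this workspace and the app already listening on the default /var/tmp/spdk.sock (thread names and priorities mirror the trace; error handling omitted):

  rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  plugin="--plugin scheduler_plugin"

  # One pinned 100%-active and one pinned idle thread per core in the 0xF mask.
  for mask in 0x1 0x2 0x4 0x8; do
    $rpc $plugin scheduler_thread_create -n active_pinned -m "$mask" -a 100
    $rpc $plugin scheduler_thread_create -n idle_pinned   -m "$mask" -a 0
  done

  # Unpinned threads; scheduler_thread_create prints the new thread id.
  $rpc $plugin scheduler_thread_create -n one_third_active -a 30
  id=$($rpc $plugin scheduler_thread_create -n half_active -a 0)
  $rpc $plugin scheduler_thread_set_active "$id" 50    # raise to 50% busy, as with thread 11 above

  # Create-then-delete exercises rebalancing on thread teardown (thread 12 above).
  id=$($rpc $plugin scheduler_thread_create -n deleted -a 100)
  $rpc $plugin scheduler_thread_delete "$id"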
00:07:03.715 00:07:03.715 real 0m3.763s 00:07:03.715 user 0m5.617s 00:07:03.715 sys 0m0.458s 00:07:03.715 11:01:28 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:03.715 11:01:28 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:03.715 ************************************ 00:07:03.715 END TEST event_scheduler 00:07:03.715 ************************************ 00:07:03.975 11:01:28 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:03.975 11:01:28 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:03.975 11:01:28 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:03.975 11:01:28 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:03.975 11:01:28 event -- common/autotest_common.sh@10 -- # set +x 00:07:03.975 ************************************ 00:07:03.975 START TEST app_repeat 00:07:03.975 ************************************ 00:07:03.975 11:01:28 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@19 -- # repeat_pid=140602 00:07:03.975 11:01:28 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:03.976 11:01:28 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:03.976 11:01:28 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 140602' 00:07:03.976 Process app_repeat pid: 140602 00:07:03.976 11:01:28 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:03.976 11:01:28 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:03.976 spdk_app_start Round 0 00:07:03.976 11:01:28 event.app_repeat -- event/event.sh@25 -- # waitforlisten 140602 /var/tmp/spdk-nbd.sock 00:07:03.976 11:01:28 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140602 ']' 00:07:03.976 11:01:28 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:03.976 11:01:28 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:03.976 11:01:28 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:03.976 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:03.976 11:01:28 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:03.976 11:01:28 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:03.976 [2024-11-17 11:01:28.456121] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
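[Annotation] app_repeat is launched with -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4, and waitforlisten blocks until the RPC socket answers. A minimal sketch of that launch-and-poll pattern, assuming SPDK_DIR points at this checkout (the real waitforlisten in autotest_common.sh is more thorough: it honors max_retries and distinguishes a dead pid from a slow start):

  app=$SPDK_DIR/test/event/app_repeat/app_repeat
  sock=/var/tmp/spdk-nbd.sock

  "$app" -r "$sock" -m 0x3 -t 4 &
  pid=$!

  for ((i = 0; i < 100; i++)); do
    # Any trivial RPC succeeds once the app is listening on the socket.
    "$SPDK_DIR/scripts/rpc.py" -s "$sock" rpc_get_methods &>/dev/null && break
    kill -0 "$pid" 2>/dev/null || { echo "app_repeat exited early" >&2; exit 1; }
    sleep 0.5
  done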
00:07:03.976 [2024-11-17 11:01:28.456204] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140602 ] 00:07:03.976 [2024-11-17 11:01:28.545542] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:03.976 [2024-11-17 11:01:28.569036] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:03.976 [2024-11-17 11:01:28.569036] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.235 11:01:28 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:04.236 11:01:28 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:04.236 11:01:28 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:04.236 Malloc0 00:07:04.236 11:01:28 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:04.495 Malloc1 00:07:04.496 11:01:29 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:04.496 11:01:29 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:04.755 /dev/nbd0 00:07:04.755 11:01:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:04.755 11:01:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 
/proc/partitions 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:04.755 1+0 records in 00:07:04.755 1+0 records out 00:07:04.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000123218 s, 33.2 MB/s 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:04.755 11:01:29 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:04.755 11:01:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:04.755 11:01:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:04.755 11:01:29 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:05.015 /dev/nbd1 00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:05.015 1+0 records in 00:07:05.015 1+0 records out 00:07:05.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290133 s, 14.1 MB/s 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:05.015 11:01:29 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
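[Annotation] Each /dev/nbdN only counts as attached after the waitfornbd probe traced above: the device name must appear in /proc/partitions, and a direct-I/O read of one block must produce a non-empty file. A sketch of that probe (the 20-iteration retry bound matches the trace; the temp-file path here is illustrative, the real test keeps it under test/event/):

  waitfornbd() {
    local nbd_name=$1 i size tmp=/tmp/nbdtest
    for ((i = 1; i <= 20; i++)); do
      grep -q -w "$nbd_name" /proc/partitions && break
      sleep 0.1
    done
    for ((i = 1; i <= 20; i++)); do
      dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct || return 1
      size=$(stat -c %s "$tmp"); rm -f "$tmp"
      [ "$size" != 0 ] && return 0
      sleep 0.1
    done
    return 1
  }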
00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.015 11:01:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:05.275 11:01:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:05.275 { 00:07:05.275 "nbd_device": "/dev/nbd0", 00:07:05.275 "bdev_name": "Malloc0" 00:07:05.275 }, 00:07:05.275 { 00:07:05.275 "nbd_device": "/dev/nbd1", 00:07:05.275 "bdev_name": "Malloc1" 00:07:05.275 } 00:07:05.275 ]' 00:07:05.275 11:01:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:05.275 { 00:07:05.275 "nbd_device": "/dev/nbd0", 00:07:05.275 "bdev_name": "Malloc0" 00:07:05.275 }, 00:07:05.275 { 00:07:05.275 "nbd_device": "/dev/nbd1", 00:07:05.275 "bdev_name": "Malloc1" 00:07:05.275 } 00:07:05.275 ]' 00:07:05.275 11:01:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:05.275 11:01:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:05.275 /dev/nbd1' 00:07:05.275 11:01:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:05.276 /dev/nbd1' 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:05.276 256+0 records in 00:07:05.276 256+0 records out 00:07:05.276 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00347211 s, 302 MB/s 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:05.276 256+0 records in 00:07:05.276 256+0 records out 00:07:05.276 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0191474 s, 54.8 MB/s 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:05.276 256+0 records in 00:07:05.276 256+0 records out 00:07:05.276 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.022665 s, 46.3 MB/s 
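[Annotation] The nbd_dd_data_verify write pass traced above seeds one 1 MiB random file and streams it through every exported device; the verify pass that follows cmp's it back. Condensed (the randfile path here is illustrative; the trace uses test/event/nbdrandtest inside the repo):

  randfile=/tmp/nbdrandtest
  nbd_list=(/dev/nbd0 /dev/nbd1)

  # write: 256 x 4 KiB of random data, pushed to each device with O_DIRECT
  dd if=/dev/urandom of="$randfile" bs=4096 count=256
  for dev in "${nbd_list[@]}"; do
    dd if="$randfile" of="$dev" bs=4096 count=256 oflag=direct
  done

  # verify: byte-compare the first 1M of each device against the source file
  for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$randfile" "$dev"    # non-zero exit means corrupted data
  done
  rm "$randfile"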
00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:05.276 11:01:29 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:05.536 11:01:30 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:05.795 11:01:30 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:05.795 11:01:30 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:05.795 11:01:30 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:05.795 11:01:30 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:05.796 11:01:30 event.app_repeat -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:05.796 11:01:30 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:05.796 11:01:30 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:05.796 11:01:30 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:05.796 11:01:30 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:05.796 11:01:30 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.796 11:01:30 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:06.056 11:01:30 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:06.056 11:01:30 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:06.315 11:01:30 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:06.315 [2024-11-17 11:01:30.923533] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:06.315 [2024-11-17 11:01:30.943030] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.315 [2024-11-17 11:01:30.943030] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:06.575 [2024-11-17 11:01:30.983792] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:06.575 [2024-11-17 11:01:30.983838] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:09.870 11:01:33 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:09.870 11:01:33 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:09.870 spdk_app_start Round 1 00:07:09.870 11:01:33 event.app_repeat -- event/event.sh@25 -- # waitforlisten 140602 /var/tmp/spdk-nbd.sock 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140602 ']' 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:09.870 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
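[Annotation] From here the log repeats the same cycle for Rounds 1 and 2. Note that waitforlisten keeps passing pid 140602, so app_repeat is a single process that apparently restarts its framework internally after each spdk_kill_instance SIGTERM, which the sleep 3 between rounds allows for. The driving loop in the event test has roughly this shape (a sketch only; waitforlisten is the autotest_common.sh helper, and the bdev/nbd steps are as traced above):

  rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  for i in {0..2}; do
    echo "spdk_app_start Round $i"
    waitforlisten "$repeat_pid" /var/tmp/spdk-nbd.sock
    $rpc bdev_malloc_create 64 4096    # Malloc0
    $rpc bdev_malloc_create 64 4096    # Malloc1
    # ... nbd start / write / verify / stop as traced above ...
    $rpc spdk_kill_instance SIGTERM    # app_repeat restarts for the next round
    sleep 3
  done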
00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:09.870 11:01:33 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:09.870 11:01:33 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:09.870 Malloc0 00:07:09.871 11:01:34 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:09.871 Malloc1 00:07:09.871 11:01:34 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:09.871 11:01:34 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:10.131 /dev/nbd0 00:07:10.131 11:01:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:10.131 11:01:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:10.131 1+0 records in 00:07:10.131 1+0 records out 00:07:10.131 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253945 s, 16.1 MB/s 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:10.131 11:01:34 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:10.131 11:01:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:10.131 11:01:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.131 11:01:34 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:10.392 /dev/nbd1 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:10.392 1+0 records in 00:07:10.392 1+0 records out 00:07:10.392 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268392 s, 15.3 MB/s 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:10.392 11:01:34 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.392 11:01:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:10.652 { 00:07:10.652 "nbd_device": "/dev/nbd0", 00:07:10.652 "bdev_name": "Malloc0" 00:07:10.652 }, 00:07:10.652 { 00:07:10.652 "nbd_device": "/dev/nbd1", 00:07:10.652 "bdev_name": "Malloc1" 00:07:10.652 } 00:07:10.652 ]' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:10.652 { 00:07:10.652 "nbd_device": "/dev/nbd0", 00:07:10.652 "bdev_name": "Malloc0" 00:07:10.652 }, 00:07:10.652 { 00:07:10.652 "nbd_device": "/dev/nbd1", 00:07:10.652 "bdev_name": "Malloc1" 00:07:10.652 } 00:07:10.652 ]' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:10.652 /dev/nbd1' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:10.652 /dev/nbd1' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:10.652 256+0 records in 00:07:10.652 256+0 records out 00:07:10.652 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0117053 s, 89.6 MB/s 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:10.652 256+0 records in 00:07:10.652 256+0 records out 00:07:10.652 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0194758 s, 53.8 MB/s 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:10.652 256+0 records in 00:07:10.652 256+0 records out 00:07:10.652 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0219952 s, 47.7 MB/s 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:10.652 11:01:35 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:10.913 11:01:35 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.173 11:01:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:11.434 11:01:35 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:11.434 11:01:35 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:11.694 11:01:36 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:11.694 [2024-11-17 11:01:36.260980] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:11.694 [2024-11-17 11:01:36.280689] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.694 [2024-11-17 11:01:36.280689] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:11.694 [2024-11-17 11:01:36.322165] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:11.694 [2024-11-17 11:01:36.322212] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:14.987 11:01:39 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:14.987 11:01:39 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:14.987 spdk_app_start Round 2 00:07:14.987 11:01:39 event.app_repeat -- event/event.sh@25 -- # waitforlisten 140602 /var/tmp/spdk-nbd.sock 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140602 ']' 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:14.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
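[Annotation] The nbd_get_count checks bracketing each round parse the nbd_get_disks JSON with jq and count /dev/nbd entries; after nbd_stop_disks the count must be 0. The check reduces to the sketch below. The '|| true' is inferred from the '-- # true' in the trace: grep -c exits non-zero when it counts zero matches, which would otherwise trip set -e.

  rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  json=$($rpc nbd_get_disks)
  count=$(echo "$json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
  if [ "$count" -ne 0 ]; then
    echo "nbd devices still attached after stop" >&2
    exit 1
  fi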
00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:14.987 11:01:39 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:14.987 11:01:39 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:14.987 Malloc0 00:07:14.987 11:01:39 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:15.247 Malloc1 00:07:15.247 11:01:39 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.247 11:01:39 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:15.507 /dev/nbd0 00:07:15.507 11:01:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:15.507 11:01:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.507 1+0 records in 00:07:15.507 1+0 records out 00:07:15.507 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249013 s, 16.4 MB/s 00:07:15.507 11:01:39 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.507 11:01:40 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.507 11:01:40 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.507 11:01:40 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.507 11:01:40 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.507 11:01:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.507 11:01:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.507 11:01:40 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:15.767 /dev/nbd1 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.767 1+0 records in 00:07:15.767 1+0 records out 00:07:15.767 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239987 s, 17.1 MB/s 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.767 11:01:40 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.767 11:01:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:16.026 { 00:07:16.026 "nbd_device": "/dev/nbd0", 00:07:16.026 "bdev_name": "Malloc0" 00:07:16.026 }, 00:07:16.026 { 00:07:16.026 "nbd_device": "/dev/nbd1", 00:07:16.026 "bdev_name": "Malloc1" 00:07:16.026 } 00:07:16.026 ]' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:16.026 { 00:07:16.026 "nbd_device": "/dev/nbd0", 00:07:16.026 "bdev_name": "Malloc0" 00:07:16.026 }, 00:07:16.026 { 00:07:16.026 "nbd_device": "/dev/nbd1", 00:07:16.026 "bdev_name": "Malloc1" 00:07:16.026 } 00:07:16.026 ]' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:16.026 /dev/nbd1' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:16.026 /dev/nbd1' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:16.026 256+0 records in 00:07:16.026 256+0 records out 00:07:16.026 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112066 s, 93.6 MB/s 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:16.026 256+0 records in 00:07:16.026 256+0 records out 00:07:16.026 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198247 s, 52.9 MB/s 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:16.026 256+0 records in 00:07:16.026 256+0 records out 00:07:16.026 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212666 s, 49.3 MB/s 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:16.026 11:01:40 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:16.027 11:01:40 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:16.287 11:01:40 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.546 11:01:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:16.806 11:01:41 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:16.806 11:01:41 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:17.065 11:01:41 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:17.065 [2024-11-17 11:01:41.651141] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:17.065 [2024-11-17 11:01:41.670393] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.065 [2024-11-17 11:01:41.670394] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.065 [2024-11-17 11:01:41.710513] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:17.065 [2024-11-17 11:01:41.710558] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:20.358 11:01:44 event.app_repeat -- event/event.sh@38 -- # waitforlisten 140602 /var/tmp/spdk-nbd.sock 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140602 ']' 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:20.358 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
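The app_repeat pass above replays SPDK's nbd_dd_data_verify helper: 1 MiB of random data is pushed through each mapped NBD device with O_DIRECT and then byte-compared against the source file. A minimal sketch of that write/verify loop, with a scratch path standing in for the workspace file (both devices assumed mapped):

# sketch of the nbd_dd_data_verify flow traced above; paths are illustrative
tmp=$(mktemp)
nbd_list=(/dev/nbd0 /dev/nbd1)
dd if=/dev/urandom of="$tmp" bs=4096 count=256             # 1 MiB of random data
for dev in "${nbd_list[@]}"; do
  dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct    # write through each NBD device
done
for dev in "${nbd_list[@]}"; do
  cmp -b -n 1M "$tmp" "$dev"                               # read back and byte-compare
done
rm -f "$tmp"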
00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:20.358 11:01:44 event.app_repeat -- event/event.sh@39 -- # killprocess 140602 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 140602 ']' 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 140602 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 140602 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 140602' 00:07:20.358 killing process with pid 140602 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@973 -- # kill 140602 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@978 -- # wait 140602 00:07:20.358 spdk_app_start is called in Round 0. 00:07:20.358 Shutdown signal received, stop current app iteration 00:07:20.358 Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 reinitialization... 00:07:20.358 spdk_app_start is called in Round 1. 00:07:20.358 Shutdown signal received, stop current app iteration 00:07:20.358 Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 reinitialization... 00:07:20.358 spdk_app_start is called in Round 2. 00:07:20.358 Shutdown signal received, stop current app iteration 00:07:20.358 Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 reinitialization... 00:07:20.358 spdk_app_start is called in Round 3. 
00:07:20.358 Shutdown signal received, stop current app iteration 00:07:20.358 11:01:44 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:20.358 11:01:44 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:20.358 00:07:20.358 real 0m16.480s 00:07:20.358 user 0m35.711s 00:07:20.358 sys 0m3.200s 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:20.358 11:01:44 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.358 ************************************ 00:07:20.358 END TEST app_repeat 00:07:20.358 ************************************ 00:07:20.358 11:01:44 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:20.358 11:01:44 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:20.358 11:01:44 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:20.358 11:01:44 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:20.358 11:01:44 event -- common/autotest_common.sh@10 -- # set +x 00:07:20.358 ************************************ 00:07:20.358 START TEST cpu_locks 00:07:20.358 ************************************ 00:07:20.358 11:01:44 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:20.618 * Looking for test storage... 00:07:20.618 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:20.618 11:01:45 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:20.618 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.618 --rc genhtml_branch_coverage=1 00:07:20.618 --rc genhtml_function_coverage=1 00:07:20.618 --rc genhtml_legend=1 00:07:20.618 --rc geninfo_all_blocks=1 00:07:20.618 --rc geninfo_unexecuted_blocks=1 00:07:20.618 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.618 ' 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:20.618 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.618 --rc genhtml_branch_coverage=1 00:07:20.618 --rc genhtml_function_coverage=1 00:07:20.618 --rc genhtml_legend=1 00:07:20.618 --rc geninfo_all_blocks=1 00:07:20.618 --rc geninfo_unexecuted_blocks=1 00:07:20.618 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.618 ' 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:20.618 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.618 --rc genhtml_branch_coverage=1 00:07:20.618 --rc genhtml_function_coverage=1 00:07:20.618 --rc genhtml_legend=1 00:07:20.618 --rc geninfo_all_blocks=1 00:07:20.618 --rc geninfo_unexecuted_blocks=1 00:07:20.618 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.618 ' 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:20.618 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.618 --rc genhtml_branch_coverage=1 00:07:20.618 --rc genhtml_function_coverage=1 00:07:20.618 --rc genhtml_legend=1 00:07:20.618 --rc geninfo_all_blocks=1 00:07:20.618 --rc geninfo_unexecuted_blocks=1 00:07:20.618 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.618 ' 00:07:20.618 11:01:45 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:20.618 11:01:45 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:20.618 11:01:45 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:20.618 11:01:45 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:20.618 11:01:45 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:20.618 ************************************ 00:07:20.618 START TEST default_locks 00:07:20.618 ************************************ 00:07:20.618 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:07:20.618 11:01:45 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=143735 00:07:20.618 11:01:45 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 143735 00:07:20.618 11:01:45 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:20.618 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 143735 ']' 00:07:20.619 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:20.619 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:20.619 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:20.619 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:20.619 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:20.619 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:20.619 [2024-11-17 11:01:45.261357] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:20.619 [2024-11-17 11:01:45.261440] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid143735 ] 00:07:20.878 [2024-11-17 11:01:45.347815] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.878 [2024-11-17 11:01:45.369859] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.137 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:21.137 11:01:45 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:07:21.137 11:01:45 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 143735 00:07:21.137 11:01:45 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 143735 00:07:21.137 11:01:45 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:21.706 lslocks: write error 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 143735 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 143735 ']' 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 143735 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 143735 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 143735' 00:07:21.707 killing process with pid 143735 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 143735 00:07:21.707 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 143735 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 143735 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 143735 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 143735 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 143735 ']' 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.966 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 
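The default_locks body traced above reduces to the locks_exist helper: a target started with -m 0x1 must hold a POSIX file lock whose path contains spdk_cpu_lock, and lslocks makes that visible. A minimal sketch, using this run's pid purely as an example:

# succeeds only while the target still holds its per-core lock file
locks_exist() {
  local pid=$1
  # grep -q exits on the first match, dropping the pipe early; that is
  # what produces the "lslocks: write error" line in the trace above
  lslocks -p "$pid" | grep -q spdk_cpu_lock
}
locks_exist 143735 && echo "core lock held by 143735"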
00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.967 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:21.967 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (143735) - No such process 00:07:21.967 ERROR: process (pid: 143735) is no longer running 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:21.967 00:07:21.967 real 0m1.278s 00:07:21.967 user 0m1.268s 00:07:21.967 sys 0m0.613s 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:21.967 11:01:46 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:21.967 ************************************ 00:07:21.967 END TEST default_locks 00:07:21.967 ************************************ 00:07:21.967 11:01:46 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:21.967 11:01:46 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:21.967 11:01:46 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:21.967 11:01:46 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:21.967 ************************************ 00:07:21.967 START TEST default_locks_via_rpc 00:07:21.967 ************************************ 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=144038 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 144038 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 144038 ']' 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:21.967 11:01:46 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.967 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:21.967 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:22.227 [2024-11-17 11:01:46.624455] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:22.227 [2024-11-17 11:01:46.624527] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144038 ] 00:07:22.227 [2024-11-17 11:01:46.710853] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.227 [2024-11-17 11:01:46.733148] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 144038 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 144038 00:07:22.486 11:01:46 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 144038 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 144038 ']' 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 144038 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
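default_locks_via_rpc flips the same locks at runtime instead of at startup: framework_disable_cpumask_locks releases the per-core lock files and framework_enable_cpumask_locks re-claims them, both over the RPC socket. A sketch of the round-trip (rpc.py path as used throughout this job, pid illustrative):

RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
$RPC framework_disable_cpumask_locks                       # drop the core lock files
lslocks -p 144038 | grep -q spdk_cpu_lock || echo "locks released"
$RPC framework_enable_cpumask_locks                        # claim them again
lslocks -p 144038 | grep -q spdk_cpu_lock && echo "locks re-acquired"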
00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144038 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144038' 00:07:23.056 killing process with pid 144038 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 144038 00:07:23.056 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 144038 00:07:23.315 00:07:23.315 real 0m1.154s 00:07:23.315 user 0m1.108s 00:07:23.315 sys 0m0.581s 00:07:23.315 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:23.315 11:01:47 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:23.315 ************************************ 00:07:23.315 END TEST default_locks_via_rpc 00:07:23.315 ************************************ 00:07:23.315 11:01:47 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:23.315 11:01:47 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:23.315 11:01:47 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:23.315 11:01:47 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:23.315 ************************************ 00:07:23.315 START TEST non_locking_app_on_locked_coremask 00:07:23.315 ************************************ 00:07:23.315 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:07:23.315 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=144329 00:07:23.315 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 144329 /var/tmp/spdk.sock 00:07:23.315 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:23.315 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144329 ']' 00:07:23.315 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.316 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:23.316 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:23.316 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:23.316 11:01:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:23.316 [2024-11-17 11:01:47.864589] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:23.316 [2024-11-17 11:01:47.864676] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144329 ] 00:07:23.316 [2024-11-17 11:01:47.950595] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.575 [2024-11-17 11:01:47.972701] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=144333 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 144333 /var/tmp/spdk2.sock 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144333 ']' 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:23.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:23.575 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:23.575 [2024-11-17 11:01:48.198874] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:23.575 [2024-11-17 11:01:48.198934] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144333 ] 00:07:23.835 [2024-11-17 11:01:48.292523] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
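The "CPU core locks deactivated" notice just above is the point of non_locking_app_on_locked_coremask: a second target can share core 0 with a lock-holding first target only because it opts out of the claim. A sketch of the pairing (binary path as in this workspace, backgrounding illustrative):

BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
$BIN -m 0x1 &                                              # claims the core-0 lock file
$BIN -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &   # skips the claim, so it starts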
00:07:23.835 [2024-11-17 11:01:48.292556] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.835 [2024-11-17 11:01:48.338072] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.095 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:24.095 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:24.095 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 144329 00:07:24.095 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 144329 00:07:24.095 11:01:48 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:25.485 lslocks: write error 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 144329 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144329 ']' 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 144329 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144329 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144329' 00:07:25.485 killing process with pid 144329 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 144329 00:07:25.485 11:01:49 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 144329 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 144333 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144333 ']' 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 144333 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144333 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144333' 00:07:26.054 killing 
process with pid 144333 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 144333 00:07:26.054 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 144333 00:07:26.314 00:07:26.314 real 0m2.991s 00:07:26.314 user 0m3.015s 00:07:26.314 sys 0m1.247s 00:07:26.314 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:26.314 11:01:50 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:26.314 ************************************ 00:07:26.314 END TEST non_locking_app_on_locked_coremask 00:07:26.314 ************************************ 00:07:26.314 11:01:50 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:26.314 11:01:50 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:26.314 11:01:50 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:26.314 11:01:50 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:26.314 ************************************ 00:07:26.314 START TEST locking_app_on_unlocked_coremask 00:07:26.314 ************************************ 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=144893 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 144893 /var/tmp/spdk.sock 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144893 ']' 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:26.314 11:01:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:26.314 [2024-11-17 11:01:50.943536] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:26.314 [2024-11-17 11:01:50.943623] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144893 ] 00:07:26.575 [2024-11-17 11:01:51.028698] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
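Teardown in each of these tests runs through the killprocess helper whose trace recurs above: confirm the pid is still an SPDK reactor, signal it, and reap it. A simplified sketch that keeps only the main steps:

killprocess() {
  local pid=$1
  kill -0 "$pid"                                   # still alive?
  [[ $(uname) == Linux ]] &&
    ps --no-headers -o comm= "$pid"                # e.g. reactor_0 in this log
  echo "killing process with pid $pid"
  kill "$pid"
  wait "$pid"                                      # reap so the next test starts clean
}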
00:07:26.575 [2024-11-17 11:01:51.028725] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.575 [2024-11-17 11:01:51.047922] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=144901 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 144901 /var/tmp/spdk2.sock 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144901 ']' 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:26.835 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:26.835 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:26.835 [2024-11-17 11:01:51.281546] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
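Every startup in this log gates on waitforlisten, whose "Waiting for process to start up..." echo recurs above. Only the message and max_retries=100 come from the trace; the polling body below is a reconstruction that probes the RPC socket until the target answers:

RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
waitforlisten() {
  local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100 i
  echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
  for ((i = 0; i < max_retries; i++)); do
    kill -0 "$pid" 2> /dev/null || return 1        # target died during startup
    "$RPC" -s "$rpc_addr" rpc_get_methods &> /dev/null && return 0
    sleep 0.1
  done
  return 1                                         # never started listening
}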
00:07:26.835 [2024-11-17 11:01:51.281619] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144901 ] 00:07:26.835 [2024-11-17 11:01:51.375220] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.835 [2024-11-17 11:01:51.416989] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.405 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:27.405 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:27.405 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 144901 00:07:27.405 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 144901 00:07:27.405 11:01:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:28.346 lslocks: write error 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 144893 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144893 ']' 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 144893 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144893 00:07:28.346 11:01:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:28.346 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:28.607 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144893' 00:07:28.607 killing process with pid 144893 00:07:28.607 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 144893 00:07:28.607 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 144893 00:07:29.177 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 144901 00:07:29.177 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144901 ']' 00:07:29.177 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 144901 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144901 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:29.178 11:01:53 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144901' 00:07:29.178 killing process with pid 144901 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 144901 00:07:29.178 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 144901 00:07:29.437 00:07:29.437 real 0m2.983s 00:07:29.437 user 0m2.977s 00:07:29.437 sys 0m1.258s 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:29.437 ************************************ 00:07:29.437 END TEST locking_app_on_unlocked_coremask 00:07:29.437 ************************************ 00:07:29.437 11:01:53 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:29.437 11:01:53 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:29.437 11:01:53 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:29.437 11:01:53 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:29.437 ************************************ 00:07:29.437 START TEST locking_app_on_locked_coremask 00:07:29.437 ************************************ 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=145461 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 145461 /var/tmp/spdk.sock 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145461 ']' 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:29.437 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:29.437 11:01:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:29.437 [2024-11-17 11:01:54.011440] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:29.437 [2024-11-17 11:01:54.011525] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145461 ] 00:07:29.696 [2024-11-17 11:01:54.095991] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.696 [2024-11-17 11:01:54.115443] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=145468 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 145468 /var/tmp/spdk2.sock 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 145468 /var/tmp/spdk2.sock 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 145468 /var/tmp/spdk2.sock 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145468 ']' 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:29.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:29.696 11:01:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:29.696 [2024-11-17 11:01:54.347657] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
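The NOT wrapper invoked just above expects this second waitforlisten to fail: process 145461 already holds the core-0 lock, so the claim error that follows is the passing outcome. A simplified sketch of NOT's exit-status bookkeeping (the real helper carries extra es handling omitted here):

NOT() {
  local es=0
  "$@" || es=$?
  (( es > 128 )) && return 1    # >128 means killed by a signal: a crash, not a clean failure
  (( es != 0 ))                 # success only when the wrapped command failed
}
NOT waitforlisten 145468 /var/tmp/spdk2.sock       # passes: core 0 is already claimed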
00:07:29.696 [2024-11-17 11:01:54.347722] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145468 ] 00:07:29.956 [2024-11-17 11:01:54.443701] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 145461 has claimed it. 00:07:29.956 [2024-11-17 11:01:54.443742] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:30.527 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (145468) - No such process 00:07:30.527 ERROR: process (pid: 145468) is no longer running 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 145461 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 145461 00:07:30.527 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:31.097 lslocks: write error 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 145461 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145461 ']' 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 145461 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145461 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145461' 00:07:31.097 killing process with pid 145461 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 145461 00:07:31.097 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 145461 00:07:31.358 00:07:31.358 real 0m1.926s 00:07:31.358 user 0m2.026s 00:07:31.358 sys 0m0.719s 00:07:31.358 11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:31.358 
11:01:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.358 ************************************ 00:07:31.358 END TEST locking_app_on_locked_coremask 00:07:31.358 ************************************ 00:07:31.358 11:01:55 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:31.358 11:01:55 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.358 11:01:55 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.358 11:01:55 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.358 ************************************ 00:07:31.358 START TEST locking_overlapped_coremask 00:07:31.358 ************************************ 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=145772 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 145772 /var/tmp/spdk.sock 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 145772 ']' 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.358 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.358 11:01:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.618 [2024-11-17 11:01:56.020661] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:31.618 [2024-11-17 11:01:56.020741] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145772 ] 00:07:31.618 [2024-11-17 11:01:56.106425] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:31.618 [2024-11-17 11:01:56.131413] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.618 [2024-11-17 11:01:56.131522] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.618 [2024-11-17 11:01:56.131523] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:31.878 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:31.878 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:31.878 11:01:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=145809 00:07:31.878 11:01:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 145809 /var/tmp/spdk2.sock 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 145809 /var/tmp/spdk2.sock 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 145809 /var/tmp/spdk2.sock 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 145809 ']' 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:31.879 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.879 11:01:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.879 [2024-11-17 11:01:56.348353] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
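The two masks in play make the collision easy to predict: -m 0x7 claims cores 0-2 and -m 0x1c claims cores 2-4, so the only core the targets can fight over is core 2, which is exactly the core the claim error below names. One line of shell arithmetic confirms the overlap:

    # 0x7 = 0b00111 (cores 0-2), 0x1c = 0b11100 (cores 2-4); AND keeps the
    # shared bits only.
    printf '0x%x\n' $(( 0x7 & 0x1c ))   # -> 0x4, i.e. bit 2 set: core 2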
00:07:31.879 [2024-11-17 11:01:56.348438] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145809 ] 00:07:31.879 [2024-11-17 11:01:56.450937] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 145772 has claimed it. 00:07:31.879 [2024-11-17 11:01:56.450974] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:32.449 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (145809) - No such process 00:07:32.449 ERROR: process (pid: 145809) is no longer running 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 145772 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 145772 ']' 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 145772 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145772 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145772' 00:07:32.449 killing process with pid 145772 00:07:32.449 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 145772 00:07:32.449 11:01:57 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 145772 00:07:33.019 00:07:33.019 real 0m1.381s 00:07:33.019 user 0m3.822s 00:07:33.019 sys 0m0.454s 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:33.019 ************************************ 00:07:33.019 END TEST locking_overlapped_coremask 00:07:33.019 ************************************ 00:07:33.019 11:01:57 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:33.019 11:01:57 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:33.019 11:01:57 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:33.019 11:01:57 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:33.019 ************************************ 00:07:33.019 START TEST locking_overlapped_coremask_via_rpc 00:07:33.019 ************************************ 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=146068 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 146068 /var/tmp/spdk.sock 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146068 ']' 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:33.019 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:33.020 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:33.020 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:33.020 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:33.020 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:33.020 [2024-11-17 11:01:57.486489] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:33.020 [2024-11-17 11:01:57.486558] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146068 ] 00:07:33.020 [2024-11-17 11:01:57.572909] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
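The check_remaining_locks pass at the end of the previous test (cpu_locks.sh@36-38 above) is a brace-expansion compare: the lock files actually present under /var/tmp must match, verbatim, the set a 3-core 0x7 mask should leave behind. As a stand-alone sketch:

    check_remaining_locks() {
        # Glob what exists, expand what a 0x7 mask should have created,
        # and require the two lists to be identical.
        local locks=(/var/tmp/spdk_cpu_lock_*)
        local locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
        [[ ${locks[*]} == "${locks_expected[*]}" ]]
    }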
00:07:33.020 [2024-11-17 11:01:57.572937] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.020 [2024-11-17 11:01:57.595838] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.020 [2024-11-17 11:01:57.595946] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.020 [2024-11-17 11:01:57.595948] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.279 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:33.279 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:33.279 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=146088 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 146088 /var/tmp/spdk2.sock 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146088 ']' 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:33.280 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:33.280 11:01:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:33.280 [2024-11-17 11:01:57.824351] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:33.280 [2024-11-17 11:01:57.824436] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146088 ] 00:07:33.280 [2024-11-17 11:01:57.927949] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
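Both targets in this test start with --disable-cpumask-locks, so neither claims anything at boot; the locks are switched on afterwards over JSON-RPC, and the first caller wins. The flow, sketched with the workspace paths used throughout this log:

    # Start with core locks off, then claim them at runtime; an overlapping
    # second caller is rejected, as the error below shows.
    ./build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks &
    ./scripts/rpc.py framework_enable_cpumask_locks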
00:07:33.280 [2024-11-17 11:01:57.927981] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.540 [2024-11-17 11:01:57.972780] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:33.540 [2024-11-17 11:01:57.976089] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.540 [2024-11-17 11:01:57.976090] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.111 [2024-11-17 11:01:58.709109] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 146068 has claimed it. 
00:07:34.111 request: 00:07:34.111 { 00:07:34.111 "method": "framework_enable_cpumask_locks", 00:07:34.111 "req_id": 1 00:07:34.111 } 00:07:34.111 Got JSON-RPC error response 00:07:34.111 response: 00:07:34.111 { 00:07:34.111 "code": -32603, 00:07:34.111 "message": "Failed to claim CPU core: 2" 00:07:34.111 } 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 146068 /var/tmp/spdk.sock 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146068 ']' 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.111 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:34.111 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 146088 /var/tmp/spdk2.sock 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146088 ']' 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:34.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
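The rejected call above is an ordinary JSON-RPC exchange, not a crash: the second target stays up and is immediately re-attached with waitforlisten. A client addressing the second socket would see the same -32603 and can treat it as an expected, recoverable error. A hedged sketch:

    # The same RPC the test issues, aimed at the second target's socket; it
    # fails with -32603 while the first target holds the overlapping cores.
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks \
        || echo 'enable rejected: overlapping core already claimed'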
00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:34.372 11:01:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:34.657 00:07:34.657 real 0m1.658s 00:07:34.657 user 0m0.799s 00:07:34.657 sys 0m0.155s 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:34.657 11:01:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.657 ************************************ 00:07:34.657 END TEST locking_overlapped_coremask_via_rpc 00:07:34.657 ************************************ 00:07:34.657 11:01:59 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:34.657 11:01:59 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 146068 ]] 00:07:34.657 11:01:59 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 146068 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146068 ']' 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146068 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146068 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146068' 00:07:34.657 killing process with pid 146068 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 146068 00:07:34.657 11:01:59 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 146068 00:07:34.920 11:01:59 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 146088 ]] 00:07:34.920 11:01:59 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 146088 00:07:34.921 11:01:59 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146088 ']' 00:07:34.921 11:01:59 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146088 00:07:34.921 11:01:59 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:34.921 11:01:59 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
00:07:34.921 11:01:59 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146088 00:07:35.182 11:01:59 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:35.182 11:01:59 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:35.182 11:01:59 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146088' 00:07:35.182 killing process with pid 146088 00:07:35.182 11:01:59 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 146088 00:07:35.182 11:01:59 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 146088 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 146068 ]] 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 146068 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146068 ']' 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146068 00:07:35.443 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (146068) - No such process 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 146068 is not found' 00:07:35.443 Process with pid 146068 is not found 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 146088 ]] 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 146088 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146088 ']' 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146088 00:07:35.443 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (146088) - No such process 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 146088 is not found' 00:07:35.443 Process with pid 146088 is not found 00:07:35.443 11:01:59 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:35.443 00:07:35.443 real 0m14.906s 00:07:35.443 user 0m24.923s 00:07:35.443 sys 0m6.136s 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:35.443 11:01:59 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:35.443 ************************************ 00:07:35.443 END TEST cpu_locks 00:07:35.443 ************************************ 00:07:35.443 00:07:35.443 real 0m39.366s 00:07:35.443 user 1m12.749s 00:07:35.443 sys 0m10.554s 00:07:35.443 11:01:59 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:35.443 11:01:59 event -- common/autotest_common.sh@10 -- # set +x 00:07:35.443 ************************************ 00:07:35.443 END TEST event 00:07:35.443 ************************************ 00:07:35.443 11:01:59 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:35.443 11:01:59 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:35.443 11:01:59 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:35.443 11:01:59 -- common/autotest_common.sh@10 -- # set +x 00:07:35.443 ************************************ 00:07:35.443 START TEST thread 00:07:35.443 ************************************ 00:07:35.443 11:02:00 thread -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:35.704 * Looking for test storage... 00:07:35.704 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:35.704 11:02:00 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:35.704 11:02:00 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:35.704 11:02:00 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:35.704 11:02:00 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:35.704 11:02:00 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:35.704 11:02:00 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:35.704 11:02:00 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:35.704 11:02:00 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:35.704 11:02:00 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:35.704 11:02:00 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:35.704 11:02:00 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:35.704 11:02:00 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:35.704 11:02:00 thread -- scripts/common.sh@345 -- # : 1 00:07:35.704 11:02:00 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:35.704 11:02:00 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:35.704 11:02:00 thread -- scripts/common.sh@365 -- # decimal 1 00:07:35.704 11:02:00 thread -- scripts/common.sh@353 -- # local d=1 00:07:35.704 11:02:00 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:35.704 11:02:00 thread -- scripts/common.sh@355 -- # echo 1 00:07:35.704 11:02:00 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:35.704 11:02:00 thread -- scripts/common.sh@366 -- # decimal 2 00:07:35.704 11:02:00 thread -- scripts/common.sh@353 -- # local d=2 00:07:35.704 11:02:00 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:35.704 11:02:00 thread -- scripts/common.sh@355 -- # echo 2 00:07:35.704 11:02:00 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:35.704 11:02:00 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:35.704 11:02:00 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:35.704 11:02:00 thread -- scripts/common.sh@368 -- # return 0 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:35.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.704 --rc genhtml_branch_coverage=1 00:07:35.704 --rc genhtml_function_coverage=1 00:07:35.704 --rc genhtml_legend=1 00:07:35.704 --rc geninfo_all_blocks=1 00:07:35.704 --rc geninfo_unexecuted_blocks=1 00:07:35.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.704 ' 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:35.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.704 --rc genhtml_branch_coverage=1 00:07:35.704 --rc genhtml_function_coverage=1 00:07:35.704 --rc genhtml_legend=1 00:07:35.704 --rc geninfo_all_blocks=1 
00:07:35.704 --rc geninfo_unexecuted_blocks=1 00:07:35.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.704 ' 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:35.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.704 --rc genhtml_branch_coverage=1 00:07:35.704 --rc genhtml_function_coverage=1 00:07:35.704 --rc genhtml_legend=1 00:07:35.704 --rc geninfo_all_blocks=1 00:07:35.704 --rc geninfo_unexecuted_blocks=1 00:07:35.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.704 ' 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:35.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.704 --rc genhtml_branch_coverage=1 00:07:35.704 --rc genhtml_function_coverage=1 00:07:35.704 --rc genhtml_legend=1 00:07:35.704 --rc geninfo_all_blocks=1 00:07:35.704 --rc geninfo_unexecuted_blocks=1 00:07:35.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.704 ' 00:07:35.704 11:02:00 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:35.704 11:02:00 thread -- common/autotest_common.sh@10 -- # set +x 00:07:35.704 ************************************ 00:07:35.704 START TEST thread_poller_perf 00:07:35.704 ************************************ 00:07:35.705 11:02:00 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:35.705 [2024-11-17 11:02:00.281749] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:35.705 [2024-11-17 11:02:00.281831] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146706 ] 00:07:35.965 [2024-11-17 11:02:00.370303] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.965 [2024-11-17 11:02:00.392791] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.965 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:36.905 [2024-11-17T10:02:01.563Z] ====================================== 00:07:36.905 [2024-11-17T10:02:01.563Z] busy:2503989628 (cyc) 00:07:36.905 [2024-11-17T10:02:01.563Z] total_run_count: 840000 00:07:36.905 [2024-11-17T10:02:01.563Z] tsc_hz: 2500000000 (cyc) 00:07:36.905 [2024-11-17T10:02:01.563Z] ====================================== 00:07:36.905 [2024-11-17T10:02:01.563Z] poller_cost: 2980 (cyc), 1192 (nsec) 00:07:36.905 00:07:36.905 real 0m1.160s 00:07:36.905 user 0m1.066s 00:07:36.905 sys 0m0.089s 00:07:36.905 11:02:01 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:36.905 11:02:01 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:36.905 ************************************ 00:07:36.905 END TEST thread_poller_perf 00:07:36.905 ************************************ 00:07:36.905 11:02:01 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:36.905 11:02:01 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:36.905 11:02:01 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:36.905 11:02:01 thread -- common/autotest_common.sh@10 -- # set +x 00:07:36.905 ************************************ 00:07:36.905 START TEST thread_poller_perf 00:07:36.905 ************************************ 00:07:36.906 11:02:01 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:36.906 [2024-11-17 11:02:01.526870] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:36.906 [2024-11-17 11:02:01.526955] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146990 ] 00:07:37.164 [2024-11-17 11:02:01.616493] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.164 [2024-11-17 11:02:01.639596] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.164 Running 1000 pollers for 1 seconds with 0 microseconds period. 
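The 1-microsecond run above reports poller_cost: 2980 (cyc), 1192 (nsec), and both numbers follow directly from the summary lines; the 0-microsecond run that follows reads the same way. A quick check:

    # cost per poll = busy cycles / total_run_count; ns = cycles / tsc_hz.
    echo $(( 2503989628 / 840000 ))              # -> 2980 cycles per poll
    echo $(( 2980 * 1000000000 / 2500000000 ))   # -> 1192 ns at 2.5 GHz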
00:07:38.104 [2024-11-17T10:02:02.762Z] ====================================== 00:07:38.104 [2024-11-17T10:02:02.762Z] busy:2501262532 (cyc) 00:07:38.104 [2024-11-17T10:02:02.762Z] total_run_count: 13249000 00:07:38.104 [2024-11-17T10:02:02.762Z] tsc_hz: 2500000000 (cyc) 00:07:38.104 [2024-11-17T10:02:02.762Z] ====================================== 00:07:38.104 [2024-11-17T10:02:02.762Z] poller_cost: 188 (cyc), 75 (nsec) 00:07:38.104 00:07:38.104 real 0m1.162s 00:07:38.104 user 0m1.058s 00:07:38.104 sys 0m0.099s 00:07:38.104 11:02:02 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:38.104 11:02:02 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:38.105 ************************************ 00:07:38.105 END TEST thread_poller_perf 00:07:38.105 ************************************ 00:07:38.105 11:02:02 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:38.105 11:02:02 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:38.105 11:02:02 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:38.105 11:02:02 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:38.105 11:02:02 thread -- common/autotest_common.sh@10 -- # set +x 00:07:38.105 ************************************ 00:07:38.105 START TEST thread_spdk_lock 00:07:38.105 ************************************ 00:07:38.105 11:02:02 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:38.365 [2024-11-17 11:02:02.773783] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:38.365 [2024-11-17 11:02:02.773867] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147167 ] 00:07:38.365 [2024-11-17 11:02:02.859961] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:38.365 [2024-11-17 11:02:02.886090] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.365 [2024-11-17 11:02:02.886090] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.936 [2024-11-17 11:02:03.380184] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.936 [2024-11-17 11:02:03.380219] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:38.936 [2024-11-17 11:02:03.380230] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x131e040 00:07:38.936 [2024-11-17 11:02:03.380915] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.936 [2024-11-17 11:02:03.381017] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.936 [2024-11-17 
11:02:03.381036] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.936 Starting test contend 00:07:38.936 Worker Delay Wait us Hold us Total us 00:07:38.936 0 3 169107 186313 355420 00:07:38.936 1 5 82083 288107 370190 00:07:38.936 PASS test contend 00:07:38.936 Starting test hold_by_poller 00:07:38.936 PASS test hold_by_poller 00:07:38.936 Starting test hold_by_message 00:07:38.936 PASS test hold_by_message 00:07:38.936 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:38.936 100014 assertions passed 00:07:38.936 0 assertions failed 00:07:38.936 00:07:38.936 real 0m0.651s 00:07:38.936 user 0m1.052s 00:07:38.936 sys 0m0.090s 00:07:38.936 11:02:03 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:38.936 11:02:03 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:38.936 ************************************ 00:07:38.936 END TEST thread_spdk_lock 00:07:38.936 ************************************ 00:07:38.936 00:07:38.936 real 0m3.419s 00:07:38.936 user 0m3.371s 00:07:38.936 sys 0m0.568s 00:07:38.936 11:02:03 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:38.936 11:02:03 thread -- common/autotest_common.sh@10 -- # set +x 00:07:38.936 ************************************ 00:07:38.936 END TEST thread 00:07:38.936 ************************************ 00:07:38.936 11:02:03 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:38.936 11:02:03 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:38.936 11:02:03 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:38.936 11:02:03 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:38.936 11:02:03 -- common/autotest_common.sh@10 -- # set +x 00:07:38.936 ************************************ 00:07:38.936 START TEST app_cmdline 00:07:38.936 ************************************ 00:07:38.936 11:02:03 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:39.198 * Looking for test storage... 
00:07:39.198 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:39.198 11:02:03 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:39.198 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.198 --rc genhtml_branch_coverage=1 00:07:39.198 --rc genhtml_function_coverage=1 00:07:39.198 --rc genhtml_legend=1 00:07:39.198 --rc geninfo_all_blocks=1 00:07:39.198 --rc geninfo_unexecuted_blocks=1 00:07:39.198 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.198 ' 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:39.198 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.198 --rc genhtml_branch_coverage=1 00:07:39.198 --rc genhtml_function_coverage=1 00:07:39.198 --rc 
genhtml_legend=1 00:07:39.198 --rc geninfo_all_blocks=1 00:07:39.198 --rc geninfo_unexecuted_blocks=1 00:07:39.198 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.198 ' 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:39.198 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.198 --rc genhtml_branch_coverage=1 00:07:39.198 --rc genhtml_function_coverage=1 00:07:39.198 --rc genhtml_legend=1 00:07:39.198 --rc geninfo_all_blocks=1 00:07:39.198 --rc geninfo_unexecuted_blocks=1 00:07:39.198 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.198 ' 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:39.198 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.198 --rc genhtml_branch_coverage=1 00:07:39.198 --rc genhtml_function_coverage=1 00:07:39.198 --rc genhtml_legend=1 00:07:39.198 --rc geninfo_all_blocks=1 00:07:39.198 --rc geninfo_unexecuted_blocks=1 00:07:39.198 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.198 ' 00:07:39.198 11:02:03 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:39.198 11:02:03 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=147355 00:07:39.198 11:02:03 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 147355 00:07:39.198 11:02:03 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 147355 ']' 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.198 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.198 11:02:03 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:39.198 [2024-11-17 11:02:03.761301] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
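This target is started with an RPC allow-list, which is what the rest of the test exercises: the two named methods answer normally, and anything else (env_dpdk_get_mem_stats below) comes back as JSON-RPC -32601 "Method not found". Sketched with this workspace's paths:

    # Only the allow-listed methods are served by this target.
    ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    ./scripts/rpc.py spdk_get_version            # allowed: prints version JSON
    ./scripts/rpc.py env_dpdk_get_mem_stats \
        || echo 'rejected with -32601, as expected'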
00:07:39.198 [2024-11-17 11:02:03.761365] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147355 ] 00:07:39.198 [2024-11-17 11:02:03.847889] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.459 [2024-11-17 11:02:03.869448] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.459 11:02:04 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:39.459 11:02:04 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:07:39.459 11:02:04 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:39.719 { 00:07:39.719 "version": "SPDK v25.01-pre git sha1 83e8405e4", 00:07:39.719 "fields": { 00:07:39.719 "major": 25, 00:07:39.719 "minor": 1, 00:07:39.719 "patch": 0, 00:07:39.719 "suffix": "-pre", 00:07:39.719 "commit": "83e8405e4" 00:07:39.719 } 00:07:39.719 } 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:39.719 11:02:04 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:39.719 11:02:04 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:39.719 11:02:04 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:39.719 11:02:04 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:39.720 11:02:04 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:39.720 11:02:04 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:39.720 11:02:04 app_cmdline -- 
common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:39.720 11:02:04 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:39.980 request: 00:07:39.980 { 00:07:39.980 "method": "env_dpdk_get_mem_stats", 00:07:39.980 "req_id": 1 00:07:39.980 } 00:07:39.980 Got JSON-RPC error response 00:07:39.980 response: 00:07:39.980 { 00:07:39.980 "code": -32601, 00:07:39.980 "message": "Method not found" 00:07:39.980 } 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:39.980 11:02:04 app_cmdline -- app/cmdline.sh@1 -- # killprocess 147355 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 147355 ']' 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 147355 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 147355 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 147355' 00:07:39.980 killing process with pid 147355 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@973 -- # kill 147355 00:07:39.980 11:02:04 app_cmdline -- common/autotest_common.sh@978 -- # wait 147355 00:07:40.241 00:07:40.241 real 0m1.333s 00:07:40.241 user 0m1.520s 00:07:40.241 sys 0m0.527s 00:07:40.241 11:02:04 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:40.241 11:02:04 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:40.241 ************************************ 00:07:40.241 END TEST app_cmdline 00:07:40.241 ************************************ 00:07:40.502 11:02:04 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:40.502 11:02:04 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:40.502 11:02:04 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:40.502 11:02:04 -- common/autotest_common.sh@10 -- # set +x 00:07:40.502 ************************************ 00:07:40.502 START TEST version 00:07:40.502 ************************************ 00:07:40.502 11:02:04 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:40.502 * Looking for test storage... 
00:07:40.502 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:40.502 11:02:05 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:40.502 11:02:05 version -- common/autotest_common.sh@1693 -- # lcov --version 00:07:40.502 11:02:05 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:40.502 11:02:05 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:40.502 11:02:05 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:40.502 11:02:05 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:40.502 11:02:05 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:40.502 11:02:05 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:40.502 11:02:05 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:40.502 11:02:05 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:40.502 11:02:05 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:40.502 11:02:05 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:40.502 11:02:05 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:40.502 11:02:05 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:40.502 11:02:05 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:40.502 11:02:05 version -- scripts/common.sh@344 -- # case "$op" in 00:07:40.502 11:02:05 version -- scripts/common.sh@345 -- # : 1 00:07:40.502 11:02:05 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:40.503 11:02:05 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:40.503 11:02:05 version -- scripts/common.sh@365 -- # decimal 1 00:07:40.503 11:02:05 version -- scripts/common.sh@353 -- # local d=1 00:07:40.503 11:02:05 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:40.503 11:02:05 version -- scripts/common.sh@355 -- # echo 1 00:07:40.503 11:02:05 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:40.503 11:02:05 version -- scripts/common.sh@366 -- # decimal 2 00:07:40.503 11:02:05 version -- scripts/common.sh@353 -- # local d=2 00:07:40.503 11:02:05 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:40.503 11:02:05 version -- scripts/common.sh@355 -- # echo 2 00:07:40.503 11:02:05 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:40.503 11:02:05 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:40.503 11:02:05 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:40.503 11:02:05 version -- scripts/common.sh@368 -- # return 0 00:07:40.503 11:02:05 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:40.503 11:02:05 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:40.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.503 --rc genhtml_branch_coverage=1 00:07:40.503 --rc genhtml_function_coverage=1 00:07:40.503 --rc genhtml_legend=1 00:07:40.503 --rc geninfo_all_blocks=1 00:07:40.503 --rc geninfo_unexecuted_blocks=1 00:07:40.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.503 ' 00:07:40.503 11:02:05 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:40.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.503 --rc genhtml_branch_coverage=1 00:07:40.503 --rc genhtml_function_coverage=1 00:07:40.503 --rc genhtml_legend=1 00:07:40.503 --rc geninfo_all_blocks=1 00:07:40.503 --rc geninfo_unexecuted_blocks=1 00:07:40.503 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.503 ' 00:07:40.503 11:02:05 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:40.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.503 --rc genhtml_branch_coverage=1 00:07:40.503 --rc genhtml_function_coverage=1 00:07:40.503 --rc genhtml_legend=1 00:07:40.503 --rc geninfo_all_blocks=1 00:07:40.503 --rc geninfo_unexecuted_blocks=1 00:07:40.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.503 ' 00:07:40.503 11:02:05 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:40.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.503 --rc genhtml_branch_coverage=1 00:07:40.503 --rc genhtml_function_coverage=1 00:07:40.503 --rc genhtml_legend=1 00:07:40.503 --rc geninfo_all_blocks=1 00:07:40.503 --rc geninfo_unexecuted_blocks=1 00:07:40.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.503 ' 00:07:40.503 11:02:05 version -- app/version.sh@17 -- # get_header_version major 00:07:40.503 11:02:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:40.503 11:02:05 version -- app/version.sh@14 -- # cut -f2 00:07:40.503 11:02:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:40.503 11:02:05 version -- app/version.sh@17 -- # major=25 00:07:40.764 11:02:05 version -- app/version.sh@18 -- # get_header_version minor 00:07:40.764 11:02:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:40.764 11:02:05 version -- app/version.sh@14 -- # cut -f2 00:07:40.764 11:02:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:40.764 11:02:05 version -- app/version.sh@18 -- # minor=1 00:07:40.764 11:02:05 version -- app/version.sh@19 -- # get_header_version patch 00:07:40.765 11:02:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:40.765 11:02:05 version -- app/version.sh@14 -- # cut -f2 00:07:40.765 11:02:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:40.765 11:02:05 version -- app/version.sh@19 -- # patch=0 00:07:40.765 11:02:05 version -- app/version.sh@20 -- # get_header_version suffix 00:07:40.765 11:02:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:40.765 11:02:05 version -- app/version.sh@14 -- # cut -f2 00:07:40.765 11:02:05 version -- app/version.sh@14 -- # tr -d '"' 00:07:40.765 11:02:05 version -- app/version.sh@20 -- # suffix=-pre 00:07:40.765 11:02:05 version -- app/version.sh@22 -- # version=25.1 00:07:40.765 11:02:05 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:40.765 11:02:05 version -- app/version.sh@28 -- # version=25.1rc0 00:07:40.765 11:02:05 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:40.765 11:02:05 version -- app/version.sh@30 -- # 
python3 -c 'import spdk; print(spdk.__version__)' 00:07:40.765 11:02:05 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:40.765 11:02:05 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:40.765 00:07:40.765 real 0m0.273s 00:07:40.765 user 0m0.143s 00:07:40.765 sys 0m0.186s 00:07:40.765 11:02:05 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:40.765 11:02:05 version -- common/autotest_common.sh@10 -- # set +x 00:07:40.765 ************************************ 00:07:40.765 END TEST version 00:07:40.765 ************************************ 00:07:40.765 11:02:05 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@194 -- # uname -s 00:07:40.765 11:02:05 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@260 -- # timing_exit lib 00:07:40.765 11:02:05 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:40.765 11:02:05 -- common/autotest_common.sh@10 -- # set +x 00:07:40.765 11:02:05 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:40.765 11:02:05 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:40.765 11:02:05 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:40.765 11:02:05 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:40.765 11:02:05 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:40.765 11:02:05 -- common/autotest_common.sh@10 -- # set +x 00:07:40.765 ************************************ 00:07:40.765 START TEST llvm_fuzz 00:07:40.765 ************************************ 00:07:40.765 11:02:05 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:41.026 * Looking for test storage... 
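The version test that just passed assembles the version string from include/spdk/version.h (major 25, minor 1, patch 0, suffix -pre, hence 25.1rc0) and checks it against python3 -c 'import spdk; print(spdk.__version__)'. The grep/cut/tr pipeline is copied verbatim from the trace; the -pre to rc0 mapping is inferred from the values logged above, so treat this as a condensed sketch rather than version.sh itself:

hdr=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h
get_field() {   # pipeline as shown in the trace for get_header_version
    grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" "$hdr" | cut -f2 | tr -d '"'
}
major=$(get_field MAJOR); minor=$(get_field MINOR)
patch=$(get_field PATCH); suffix=$(get_field SUFFIX)
version=$major.$minor
(( patch != 0 )) && version=$version.$patch     # patch is 0 here, so omitted
[[ $suffix == -pre ]] && version=${version}rc0  # yields 25.1rc0, matching py_version
echo "$version"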
00:07:41.026 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:41.026 11:02:05 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:41.026 11:02:05 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:41.026 11:02:05 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:41.026 11:02:05 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:41.026 11:02:05 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:41.027 11:02:05 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:41.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.027 --rc genhtml_branch_coverage=1 00:07:41.027 --rc genhtml_function_coverage=1 00:07:41.027 --rc genhtml_legend=1 00:07:41.027 --rc geninfo_all_blocks=1 00:07:41.027 --rc geninfo_unexecuted_blocks=1 00:07:41.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.027 ' 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:41.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.027 --rc genhtml_branch_coverage=1 00:07:41.027 --rc genhtml_function_coverage=1 00:07:41.027 --rc genhtml_legend=1 00:07:41.027 --rc geninfo_all_blocks=1 00:07:41.027 --rc 
geninfo_unexecuted_blocks=1 00:07:41.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.027 ' 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:41.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.027 --rc genhtml_branch_coverage=1 00:07:41.027 --rc genhtml_function_coverage=1 00:07:41.027 --rc genhtml_legend=1 00:07:41.027 --rc geninfo_all_blocks=1 00:07:41.027 --rc geninfo_unexecuted_blocks=1 00:07:41.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.027 ' 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:41.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.027 --rc genhtml_branch_coverage=1 00:07:41.027 --rc genhtml_function_coverage=1 00:07:41.027 --rc genhtml_legend=1 00:07:41.027 --rc geninfo_all_blocks=1 00:07:41.027 --rc geninfo_unexecuted_blocks=1 00:07:41.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.027 ' 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:41.027 11:02:05 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:41.027 11:02:05 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:41.027 ************************************ 00:07:41.027 START TEST nvmf_llvm_fuzz 00:07:41.027 ************************************ 00:07:41.027 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:41.294 * Looking for test storage... 
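The fuzzer list llvm.sh iterates over is a directory glob with the path prefix stripped: the echo above shows it resolving to 'common.sh llvm-gcov.sh nvmf vfio', and the case statement then lets only real target directories through to run_test. A minimal reproduction of that selection; the case patterns are assumed, since the trace only shows helper entries being skipped before nvmf/run.sh starts:

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
fuzzers=("$rootdir/test/fuzz/llvm/"*)   # glob yields full paths
fuzzers=("${fuzzers[@]##*/}")           # strip to basenames, as in the trace
for fuzzer in "${fuzzers[@]}"; do
    case "$fuzzer" in
        nvmf|vfio) echo "would run $rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
        *) ;;   # common.sh and llvm-gcov.sh are helpers, not fuzz targets
    esac
done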
00:07:41.294 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:41.294 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:41.295 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.295 --rc genhtml_branch_coverage=1 00:07:41.295 --rc genhtml_function_coverage=1 00:07:41.295 --rc genhtml_legend=1 00:07:41.295 --rc geninfo_all_blocks=1 00:07:41.295 --rc geninfo_unexecuted_blocks=1 00:07:41.295 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.295 ' 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:41.295 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.295 --rc genhtml_branch_coverage=1 00:07:41.295 --rc genhtml_function_coverage=1 00:07:41.295 --rc genhtml_legend=1 00:07:41.295 --rc geninfo_all_blocks=1 00:07:41.295 --rc geninfo_unexecuted_blocks=1 00:07:41.295 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.295 ' 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:41.295 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.295 --rc genhtml_branch_coverage=1 00:07:41.295 --rc genhtml_function_coverage=1 00:07:41.295 --rc genhtml_legend=1 00:07:41.295 --rc geninfo_all_blocks=1 00:07:41.295 --rc geninfo_unexecuted_blocks=1 00:07:41.295 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.295 ' 00:07:41.295 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:41.295 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.296 --rc genhtml_branch_coverage=1 00:07:41.296 --rc genhtml_function_coverage=1 00:07:41.296 --rc genhtml_legend=1 00:07:41.296 --rc geninfo_all_blocks=1 00:07:41.296 --rc geninfo_unexecuted_blocks=1 00:07:41.296 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.296 ' 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:41.296 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:07:41.297 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:07:41.298 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:41.298 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:07:41.299 11:02:05 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:07:41.299 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:07:41.305 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:41.306 11:02:05 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:41.306 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:41.307 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:41.307 #define SPDK_CONFIG_H 00:07:41.307 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:41.307 #define SPDK_CONFIG_APPS 1 00:07:41.307 #define SPDK_CONFIG_ARCH native 00:07:41.307 #undef SPDK_CONFIG_ASAN 00:07:41.307 #undef SPDK_CONFIG_AVAHI 00:07:41.307 #undef SPDK_CONFIG_CET 00:07:41.307 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:41.307 #define SPDK_CONFIG_COVERAGE 1 00:07:41.307 #define SPDK_CONFIG_CROSS_PREFIX 00:07:41.307 #undef SPDK_CONFIG_CRYPTO 00:07:41.307 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:41.307 #undef SPDK_CONFIG_CUSTOMOCF 00:07:41.307 #undef SPDK_CONFIG_DAOS 00:07:41.307 #define SPDK_CONFIG_DAOS_DIR 00:07:41.307 #define SPDK_CONFIG_DEBUG 1 00:07:41.307 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:41.307 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:41.308 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:41.308 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:41.308 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:41.308 #undef SPDK_CONFIG_DPDK_UADK 00:07:41.308 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:41.308 #define SPDK_CONFIG_EXAMPLES 1 00:07:41.308 #undef SPDK_CONFIG_FC 00:07:41.308 #define SPDK_CONFIG_FC_PATH 00:07:41.308 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:41.308 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:41.308 #define SPDK_CONFIG_FSDEV 1 00:07:41.308 #undef 
SPDK_CONFIG_FUSE 00:07:41.308 #define SPDK_CONFIG_FUZZER 1 00:07:41.308 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:41.308 #undef SPDK_CONFIG_GOLANG 00:07:41.308 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:41.308 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:41.308 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:41.308 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:41.308 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:41.308 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:41.308 #undef SPDK_CONFIG_HAVE_LZ4 00:07:41.308 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:41.308 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:41.308 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:41.308 #define SPDK_CONFIG_IDXD 1 00:07:41.308 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:41.308 #undef SPDK_CONFIG_IPSEC_MB 00:07:41.308 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:41.308 #define SPDK_CONFIG_ISAL 1 00:07:41.308 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:41.308 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:41.308 #define SPDK_CONFIG_LIBDIR 00:07:41.308 #undef SPDK_CONFIG_LTO 00:07:41.308 #define SPDK_CONFIG_MAX_LCORES 128 00:07:41.310 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:07:41.310 #define SPDK_CONFIG_NVME_CUSE 1 00:07:41.310 #undef SPDK_CONFIG_OCF 00:07:41.310 #define SPDK_CONFIG_OCF_PATH 00:07:41.310 #define SPDK_CONFIG_OPENSSL_PATH 00:07:41.310 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:41.310 #define SPDK_CONFIG_PGO_DIR 00:07:41.310 #undef SPDK_CONFIG_PGO_USE 00:07:41.310 #define SPDK_CONFIG_PREFIX /usr/local 00:07:41.310 #undef SPDK_CONFIG_RAID5F 00:07:41.310 #undef SPDK_CONFIG_RBD 00:07:41.310 #define SPDK_CONFIG_RDMA 1 00:07:41.310 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:41.310 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:41.310 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:41.310 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:41.310 #undef SPDK_CONFIG_SHARED 00:07:41.310 #undef SPDK_CONFIG_SMA 00:07:41.310 #define SPDK_CONFIG_TESTS 1 00:07:41.310 #undef SPDK_CONFIG_TSAN 00:07:41.310 #define SPDK_CONFIG_UBLK 1 00:07:41.310 #define SPDK_CONFIG_UBSAN 1 00:07:41.310 #undef SPDK_CONFIG_UNIT_TESTS 00:07:41.310 #undef SPDK_CONFIG_URING 00:07:41.310 #define SPDK_CONFIG_URING_PATH 00:07:41.310 #undef SPDK_CONFIG_URING_ZNS 00:07:41.310 #undef SPDK_CONFIG_USDT 00:07:41.310 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:41.311 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:41.311 #define SPDK_CONFIG_VFIO_USER 1 00:07:41.311 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:41.311 #define SPDK_CONFIG_VHOST 1 00:07:41.311 #define SPDK_CONFIG_VIRTIO 1 00:07:41.311 #undef SPDK_CONFIG_VTUNE 00:07:41.311 #define SPDK_CONFIG_VTUNE_DIR 00:07:41.311 #define SPDK_CONFIG_WERROR 1 00:07:41.311 #define SPDK_CONFIG_WPDK_DIR 00:07:41.311 #undef SPDK_CONFIG_XNVME 00:07:41.311 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:41.311 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:41.312 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:41.313 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:41.314 11:02:05 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:41.314 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:41.316 
11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:41.316 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 147857 ]] 00:07:41.317 
11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 147857 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:07:41.317 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:07:41.318 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:07:41.318 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:07:41.578 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.FNJOOx 00:07:41.578 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:41.578 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.FNJOOx/tests/nvmf /tmp/spdk.FNJOOx 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=53091487744 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730553856 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=8639066112 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861848576 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865276928 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340117504 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346114048 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5996544 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30865100800 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865276928 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=176128 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173040640 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173052928 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:07:41.579 * Looking for test storage... 
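The xtrace above walks set_test_storage as it loads "df -T" output into the mounts/fss/sizes/avails/uses associative arrays, one row per mount point. A minimal standalone sketch of that parsing step, reusing the traced variable names (the process-substitution plumbing is an assumption; the trace only shows the per-row reads and array assignments):

declare -A mounts fss sizes avails uses
while read -r source fs size use avail _ mount; do
    mounts["$mount"]=$source    # device, e.g. spdk_devtmpfs, /dev/pmem0, tmpfs
    fss["$mount"]=$fs           # filesystem type: devtmpfs, ext2, overlay, tmpfs
    sizes["$mount"]=$size       # total capacity
    uses["$mount"]=$use         # space already consumed
    avails["$mount"]=$avail     # free space, checked against requested_size
done < <(df -T | grep -v Filesystem)

The candidate loop that follows resolves each storage candidate's mount point with df plus awk and accepts the first one whose avails entry covers the requested 2214592512 bytes; here the overlay root at / qualifies with roughly 53 GB free.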
00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=53091487744 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=10853658624 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:41.579 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:41.579 11:02:05 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:41.579 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:41.580 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.580 --rc genhtml_branch_coverage=1 00:07:41.580 --rc genhtml_function_coverage=1 00:07:41.580 --rc genhtml_legend=1 00:07:41.580 --rc geninfo_all_blocks=1 00:07:41.580 --rc geninfo_unexecuted_blocks=1 00:07:41.580 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.580 ' 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:41.580 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.580 --rc genhtml_branch_coverage=1 00:07:41.580 --rc genhtml_function_coverage=1 00:07:41.580 --rc genhtml_legend=1 00:07:41.580 --rc geninfo_all_blocks=1 00:07:41.580 --rc geninfo_unexecuted_blocks=1 00:07:41.580 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.580 ' 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:41.580 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.580 --rc genhtml_branch_coverage=1 00:07:41.580 --rc genhtml_function_coverage=1 00:07:41.580 --rc genhtml_legend=1 00:07:41.580 --rc geninfo_all_blocks=1 00:07:41.580 --rc geninfo_unexecuted_blocks=1 00:07:41.580 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.580 ' 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:41.580 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:41.580 --rc genhtml_branch_coverage=1 00:07:41.580 --rc genhtml_function_coverage=1 00:07:41.580 --rc genhtml_legend=1 00:07:41.580 --rc geninfo_all_blocks=1 00:07:41.580 --rc geninfo_unexecuted_blocks=1 00:07:41.580 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:41.580 ' 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:41.580 11:02:06 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:41.580 11:02:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:41.580 [2024-11-17 11:02:06.141130] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:41.580 [2024-11-17 11:02:06.141203] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148112 ] 00:07:41.841 [2024-11-17 11:02:06.346744] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.841 [2024-11-17 11:02:06.359001] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.841 [2024-11-17 11:02:06.411298] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.841 [2024-11-17 11:02:06.427652] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:41.841 INFO: Running with entropic power schedule (0xFF, 100). 00:07:41.841 INFO: Seed: 2370285756 00:07:41.841 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:41.841 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:41.841 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:41.841 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.841 #2 INITED exec/s: 0 rss: 65Mb 00:07:41.841 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:41.841 This may also happen if the target rejected all inputs we tried so far 00:07:42.101 [2024-11-17 11:02:06.503719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.101 [2024-11-17 11:02:06.503755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.362 NEW_FUNC[1/715]: 0x452788 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:42.362 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.362 #16 NEW cov: 12161 ft: 12166 corp: 2/122b lim: 320 exec/s: 0 rss: 72Mb L: 121/121 MS: 4 ChangeByte-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:42.362 [2024-11-17 11:02:06.864826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.362 [2024-11-17 11:02:06.864875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.362 #22 NEW cov: 12302 ft: 12814 corp: 3/243b lim: 320 exec/s: 0 rss: 72Mb L: 121/121 MS: 1 ShuffleBytes- 00:07:42.362 [2024-11-17 11:02:06.935119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:42.362 [2024-11-17 11:02:06.935148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.362 [2024-11-17 11:02:06.935288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) 
qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.362 [2024-11-17 11:02:06.935303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.362 #23 NEW cov: 12308 ft: 13438 corp: 4/379b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 1 InsertRepeatedBytes- 00:07:42.362 [2024-11-17 11:02:06.975221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.362 [2024-11-17 11:02:06.975248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.362 [2024-11-17 11:02:06.975375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.362 [2024-11-17 11:02:06.975392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.362 #24 NEW cov: 12393 ft: 13719 corp: 5/560b lim: 320 exec/s: 0 rss: 72Mb L: 181/181 MS: 1 CopyPart- 00:07:42.622 [2024-11-17 11:02:07.045583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:42.622 [2024-11-17 11:02:07.045610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.045738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.045755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.045873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:42.622 [2024-11-17 11:02:07.045888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.622 #25 NEW cov: 12394 ft: 13941 corp: 6/802b lim: 320 exec/s: 0 rss: 72Mb L: 242/242 MS: 1 InsertRepeatedBytes- 00:07:42.622 [2024-11-17 11:02:07.115765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:42.622 [2024-11-17 11:02:07.115793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.115932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.115950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.116078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:42.622 [2024-11-17 
11:02:07.116096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.622 #31 NEW cov: 12394 ft: 13994 corp: 7/1044b lim: 320 exec/s: 0 rss: 72Mb L: 242/242 MS: 1 ChangeBit- 00:07:42.622 [2024-11-17 11:02:07.185838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.185866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.186002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.186018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.622 #32 NEW cov: 12394 ft: 14055 corp: 8/1235b lim: 320 exec/s: 0 rss: 72Mb L: 191/242 MS: 1 InsertRepeatedBytes- 00:07:42.622 [2024-11-17 11:02:07.236311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:42.622 [2024-11-17 11:02:07.236339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.236474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.236491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.236616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:c1c14444 cdw10:44444444 cdw11:44444444 00:07:42.622 [2024-11-17 11:02:07.236633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.236759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:7 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:42.622 [2024-11-17 11:02:07.236776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.622 #33 NEW cov: 12394 ft: 14379 corp: 9/1498b lim: 320 exec/s: 0 rss: 72Mb L: 263/263 MS: 1 InsertRepeatedBytes- 00:07:42.622 [2024-11-17 11:02:07.276036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.276068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.622 [2024-11-17 11:02:07.276205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.622 [2024-11-17 11:02:07.276223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
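Earlier in the trace, scripts/common.sh gated lcov usage with "lt 1.15 2", which expands to cmp_versions 1.15 '<' 2: both versions are split on '.', '-' and ':' and compared component by component, with the decimal helper validating each field as a number. A condensed reconstruction of that comparison logic, offered as a sketch rather than the verbatim source (padding missing components with 0 is an assumption):

lt() { cmp_versions "$1" '<' "$2"; }

cmp_versions() {
    local -a ver1 ver2
    local op=$2 v a b
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$3"
    # iterate up to the longer of the two component lists (ver1_l/ver2_l in the trace)
    for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
        a=${ver1[v]:-0} b=${ver2[v]:-0}
        [[ $a =~ ^[0-9]+$ ]] || a=0   # the traced decimal helper: non-numeric fields become 0
        [[ $b =~ ^[0-9]+$ ]] || b=0
        if (( a > b )); then [[ $op == '>' || $op == '>=' ]]; return; fi
        if (( a < b )); then [[ $op == '<' || $op == '<=' ]]; return; fi
    done
    [[ $op == '==' || $op == '<=' || $op == '>=' ]]   # every component compared equal
}

With 1.15 against 2 the first component already decides the comparison (1 < 2), matching the traced return 0, so the run enables branch and function coverage in LCOV_OPTS with the llvm-gcov.sh gcov tool.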
00:07:42.884 #34 NEW cov: 12394 ft: 14408 corp: 10/1679b lim: 320 exec/s: 0 rss: 72Mb L: 181/263 MS: 1 ChangeByte- 00:07:42.884 [2024-11-17 11:02:07.346453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:42.884 [2024-11-17 11:02:07.346482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.346622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:44444444 cdw11:44444444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.884 [2024-11-17 11:02:07.346639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.346759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:42.884 [2024-11-17 11:02:07.346775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.884 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:42.884 #35 NEW cov: 12417 ft: 14503 corp: 11/1906b lim: 320 exec/s: 0 rss: 72Mb L: 227/263 MS: 1 EraseBytes- 00:07:42.884 [2024-11-17 11:02:07.396211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.884 [2024-11-17 11:02:07.396240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.884 #36 NEW cov: 12417 ft: 14559 corp: 12/2028b lim: 320 exec/s: 0 rss: 72Mb L: 122/263 MS: 1 InsertByte- 00:07:42.884 [2024-11-17 11:02:07.446891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:42.884 [2024-11-17 11:02:07.446920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.447052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.884 [2024-11-17 11:02:07.447069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.447190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:c1c14444 cdw10:44444444 cdw11:44444444 00:07:42.884 [2024-11-17 11:02:07.447210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.447330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:7 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:42.884 [2024-11-17 11:02:07.447347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.884 #37 NEW cov: 12417 ft: 14617 corp: 13/2291b lim: 320 exec/s: 37 rss: 
73Mb L: 263/263 MS: 1 ChangeByte- 00:07:42.884 [2024-11-17 11:02:07.516904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb9b9b9b9000001ff 00:07:42.884 [2024-11-17 11:02:07.516933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.517073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:42.884 [2024-11-17 11:02:07.517091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.884 [2024-11-17 11:02:07.517222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:42.884 [2024-11-17 11:02:07.517239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.884 #38 NEW cov: 12417 ft: 14644 corp: 14/2533b lim: 320 exec/s: 38 rss: 73Mb L: 242/263 MS: 1 ChangeBinInt- 00:07:43.144 [2024-11-17 11:02:07.557060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb9b9b9b9000001ff 00:07:43.144 [2024-11-17 11:02:07.557089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.144 [2024-11-17 11:02:07.557216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.144 [2024-11-17 11:02:07.557234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.144 [2024-11-17 11:02:07.557356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.144 [2024-11-17 11:02:07.557374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.144 #39 NEW cov: 12417 ft: 14683 corp: 15/2783b lim: 320 exec/s: 39 rss: 73Mb L: 250/263 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\020"- 00:07:43.144 [2024-11-17 11:02:07.627046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.144 [2024-11-17 11:02:07.627074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.144 [2024-11-17 11:02:07.627207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:9cac89ff cdw11:60c5e6b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.144 [2024-11-17 11:02:07.627227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.144 #40 NEW cov: 12417 ft: 14715 corp: 16/2964b lim: 320 exec/s: 40 rss: 73Mb L: 181/263 MS: 1 CMP- DE: "\377\211\254\234\267\346\305`"- 00:07:43.144 [2024-11-17 11:02:07.677270] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.144 [2024-11-17 11:02:07.677301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.144 [2024-11-17 11:02:07.677434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:9cac89ff cdw11:60c5e6b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.144 [2024-11-17 11:02:07.677450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.144 #41 NEW cov: 12417 ft: 14720 corp: 17/3145b lim: 320 exec/s: 41 rss: 73Mb L: 181/263 MS: 1 ChangeBit- 00:07:43.144 [2024-11-17 11:02:07.747766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:43.144 [2024-11-17 11:02:07.747793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.144 [2024-11-17 11:02:07.747929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.144 [2024-11-17 11:02:07.747947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.145 [2024-11-17 11:02:07.748052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:c1c14444 cdw10:44444444 cdw11:44444444 00:07:43.145 [2024-11-17 11:02:07.748080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.145 [2024-11-17 11:02:07.748216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:7 nsid:44444444 cdw10:46464646 cdw11:46464646 00:07:43.145 [2024-11-17 11:02:07.748234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.145 #42 NEW cov: 12417 ft: 14747 corp: 18/3408b lim: 320 exec/s: 42 rss: 73Mb L: 263/263 MS: 1 CrossOver- 00:07:43.405 [2024-11-17 11:02:07.817350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.405 [2024-11-17 11:02:07.817377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.405 #48 NEW cov: 12417 ft: 14846 corp: 19/3529b lim: 320 exec/s: 48 rss: 73Mb L: 121/263 MS: 1 CopyPart- 00:07:43.405 [2024-11-17 11:02:07.857810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb9b9b9b9000001ff 00:07:43.405 [2024-11-17 11:02:07.857837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.857963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x4646464646464646 00:07:43.405 [2024-11-17 11:02:07.857980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.858113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.405 [2024-11-17 11:02:07.858129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.405 #49 NEW cov: 12417 ft: 14904 corp: 20/3771b lim: 320 exec/s: 49 rss: 73Mb L: 242/263 MS: 1 PersAutoDict- DE: "\377\211\254\234\267\346\305`"- 00:07:43.405 [2024-11-17 11:02:07.898065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.405 [2024-11-17 11:02:07.898091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.898229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.405 [2024-11-17 11:02:07.898245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.898385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:6 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.405 [2024-11-17 11:02:07.898402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.405 #50 NEW cov: 12417 ft: 14935 corp: 21/3999b lim: 320 exec/s: 50 rss: 73Mb L: 228/263 MS: 1 CopyPart- 00:07:43.405 [2024-11-17 11:02:07.937646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.405 [2024-11-17 11:02:07.937672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.405 #51 NEW cov: 12417 ft: 14998 corp: 22/4120b lim: 320 exec/s: 51 rss: 73Mb L: 121/263 MS: 1 CrossOver- 00:07:43.405 [2024-11-17 11:02:07.988370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:43.405 [2024-11-17 11:02:07.988397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.988535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b4646464646 00:07:43.405 [2024-11-17 11:02:07.988552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.988688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.405 [2024-11-17 11:02:07.988706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:07.988838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:7 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.405 [2024-11-17 11:02:07.988854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.405 #52 NEW cov: 12417 ft: 15008 corp: 23/4393b lim: 320 exec/s: 52 rss: 73Mb L: 273/273 MS: 1 InsertRepeatedBytes- 00:07:43.405 [2024-11-17 11:02:08.038494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb9b9b9b9000001ff 00:07:43.405 [2024-11-17 11:02:08.038523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:08.038659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.405 [2024-11-17 11:02:08.038678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.405 [2024-11-17 11:02:08.038809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.405 [2024-11-17 11:02:08.038828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.666 #58 NEW cov: 12417 ft: 15038 corp: 24/4644b lim: 320 exec/s: 58 rss: 73Mb L: 251/273 MS: 1 InsertByte- 00:07:43.666 [2024-11-17 11:02:08.108632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.108663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.666 [2024-11-17 11:02:08.108792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:9cac89ff cdw11:60c5e6b7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.108807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.666 [2024-11-17 11:02:08.108944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:6 nsid:3a3a3a3a cdw10:3a3a3a3a cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3a3a3a3a3a3a3a3a 00:07:43.666 [2024-11-17 11:02:08.108963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.666 #59 NEW cov: 12417 ft: 15049 corp: 25/4868b lim: 320 exec/s: 59 rss: 73Mb L: 224/273 MS: 1 InsertRepeatedBytes- 00:07:43.666 [2024-11-17 11:02:08.158361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.158387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.666 #60 NEW cov: 12417 ft: 15101 corp: 26/4984b lim: 320 
exec/s: 60 rss: 73Mb L: 116/273 MS: 1 EraseBytes- 00:07:43.666 [2024-11-17 11:02:08.218933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.218961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.666 [2024-11-17 11:02:08.219105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.219121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.666 [2024-11-17 11:02:08.219251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:6 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.219268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.666 #61 NEW cov: 12417 ft: 15125 corp: 27/5213b lim: 320 exec/s: 61 rss: 73Mb L: 229/273 MS: 1 InsertByte- 00:07:43.666 [2024-11-17 11:02:08.289121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb9b9b9b9000001ff 00:07:43.666 [2024-11-17 11:02:08.289147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.666 [2024-11-17 11:02:08.289272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.666 [2024-11-17 11:02:08.289289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.666 [2024-11-17 11:02:08.289425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:6 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.666 [2024-11-17 11:02:08.289441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.666 #62 NEW cov: 12417 ft: 15143 corp: 28/5455b lim: 320 exec/s: 62 rss: 73Mb L: 242/273 MS: 1 CopyPart- 00:07:43.927 [2024-11-17 11:02:08.338901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:98464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.927 [2024-11-17 11:02:08.338932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.927 #64 NEW cov: 12417 ft: 15148 corp: 29/5569b lim: 320 exec/s: 64 rss: 73Mb L: 114/273 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:43.927 [2024-11-17 11:02:08.389333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.927 [2024-11-17 11:02:08.389361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.927 [2024-11-17 
11:02:08.389487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.927 [2024-11-17 11:02:08.389502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.927 #65 NEW cov: 12417 ft: 15162 corp: 30/5750b lim: 320 exec/s: 65 rss: 73Mb L: 181/273 MS: 1 ChangeBit- 00:07:43.927 [2024-11-17 11:02:08.439807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:4 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x46464646ffffffff 00:07:43.927 [2024-11-17 11:02:08.439834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.927 [2024-11-17 11:02:08.439971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:5 nsid:46464646 cdw10:44444444 cdw11:44444444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.927 [2024-11-17 11:02:08.439989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.927 [2024-11-17 11:02:08.440121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (46) qid:0 cid:6 nsid:46464646 cdw10:46464646 cdw11:46464646 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:43.927 [2024-11-17 11:02:08.440137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.927 [2024-11-17 11:02:08.440259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:7 nsid:44444444 cdw10:44444444 cdw11:44444444 00:07:43.927 [2024-11-17 11:02:08.440274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.927 #66 NEW cov: 12417 ft: 15205 corp: 31/6058b lim: 320 exec/s: 33 rss: 73Mb L: 308/308 MS: 1 CopyPart- 00:07:43.927 #66 DONE cov: 12417 ft: 15205 corp: 31/6058b lim: 320 exec/s: 33 rss: 73Mb 00:07:43.927 ###### Recommended dictionary. ###### 00:07:43.927 "\000\000\000\000\000\000\000\020" # Uses: 0 00:07:43.927 "\377\211\254\234\267\346\305`" # Uses: 1 00:07:43.927 ###### End of recommended dictionary. 
###### 00:07:43.927 Done 66 runs in 2 second(s) 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:43.927 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:44.188 11:02:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:44.188 [2024-11-17 11:02:08.623188] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:44.188 [2024-11-17 11:02:08.623266] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148399 ] 00:07:44.188 [2024-11-17 11:02:08.828522] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.188 [2024-11-17 11:02:08.841499] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.449 [2024-11-17 11:02:08.893844] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:44.449 [2024-11-17 11:02:08.910160] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:44.450 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:44.450 INFO: Seed: 557326206 00:07:44.450 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:44.450 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:44.450 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:44.450 INFO: A corpus is not provided, starting from an empty corpus 00:07:44.450 #2 INITED exec/s: 0 rss: 65Mb 00:07:44.450 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:44.450 This may also happen if the target rejected all inputs we tried so far 00:07:44.450 [2024-11-17 11:02:08.976127] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (37888) > buf size (4096) 00:07:44.450 [2024-11-17 11:02:08.976466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:24ff0089 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.450 [2024-11-17 11:02:08.976506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.726 NEW_FUNC[1/716]: 0x453088 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:44.727 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.727 #4 NEW cov: 12285 ft: 12265 corp: 2/11b lim: 30 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 InsertByte-CMP- DE: "\377\211\254\235\2005\036\032"- 00:07:44.727 [2024-11-17 11:02:09.317139] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:44.727 [2024-11-17 11:02:09.317310] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:44.727 [2024-11-17 11:02:09.317461] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:44.727 [2024-11-17 11:02:09.317821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.727 [2024-11-17 11:02:09.317864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.727 [2024-11-17 11:02:09.317998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.727 [2024-11-17 11:02:09.318018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.727 [2024-11-17 11:02:09.318137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.727 [2024-11-17 11:02:09.318156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.727 #6 NEW cov: 12404 ft: 13326 corp: 3/31b lim: 30 exec/s: 0 rss: 72Mb L: 20/20 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:44.993 [2024-11-17 11:02:09.377124] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:44.993 [2024-11-17 11:02:09.377483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ac2481ff cdw11:00000001 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.377514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.993 #7 NEW cov: 12410 ft: 13528 corp: 4/42b lim: 30 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 InsertByte- 00:07:44.993 [2024-11-17 11:02:09.447158] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (55296) > buf size (4096) 00:07:44.993 [2024-11-17 11:02:09.447529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff0024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.447558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.993 #8 NEW cov: 12495 ft: 13857 corp: 5/52b lim: 30 exec/s: 0 rss: 72Mb L: 10/20 MS: 1 ShuffleBytes- 00:07:44.993 [2024-11-17 11:02:09.497377] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (317440) > buf size (4096) 00:07:44.993 [2024-11-17 11:02:09.497771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff8124 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.497802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.993 #9 NEW cov: 12495 ft: 13973 corp: 6/62b lim: 30 exec/s: 0 rss: 72Mb L: 10/20 MS: 1 CopyPart- 00:07:44.993 [2024-11-17 11:02:09.567606] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (55296) > buf size (4096) 00:07:44.993 [2024-11-17 11:02:09.567978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff0024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.568008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.993 #10 NEW cov: 12495 ft: 14108 corp: 7/73b lim: 30 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 InsertByte- 00:07:44.993 [2024-11-17 11:02:09.617932] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (841728) > buf size (4096) 00:07:44.993 [2024-11-17 11:02:09.618108] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001a0a 00:07:44.993 [2024-11-17 11:02:09.618277] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x891a 00:07:44.993 [2024-11-17 11:02:09.618652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.618680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.993 [2024-11-17 11:02:09.618811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9d800235 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.618834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.993 [2024-11-17 11:02:09.618957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:35ff0024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.993 [2024-11-17 11:02:09.618974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.254 #16 NEW cov: 12495 ft: 14207 corp: 8/92b lim: 30 exec/s: 0 rss: 72Mb L: 19/20 MS: 1 CrossOver- 00:07:45.254 [2024-11-17 11:02:09.688319] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.254 [2024-11-17 11:02:09.688524] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (419432) > buf size (4096) 00:07:45.254 [2024-11-17 11:02:09.688685] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a99 00:07:45.254 [2024-11-17 11:02:09.688851] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.254 [2024-11-17 11:02:09.689190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.689219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.254 [2024-11-17 11:02:09.689344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.689360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.254 [2024-11-17 11:02:09.689485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e23029d cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.689505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.254 [2024-11-17 11:02:09.689632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.689650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.254 #17 NEW cov: 12495 ft: 14733 corp: 9/119b lim: 30 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 CrossOver- 00:07:45.254 [2024-11-17 11:02:09.758253] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448512) > buf size (4096) 00:07:45.254 [2024-11-17 11:02:09.758639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff8124 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.758668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.254 #18 NEW cov: 12495 ft: 14767 corp: 10/129b lim: 30 exec/s: 0 rss: 72Mb L: 10/27 MS: 1 ChangeBit- 00:07:45.254 [2024-11-17 11:02:09.808313] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:45.254 [2024-11-17 11:02:09.808704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4ddb81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.808734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.254 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:45.254 #19 NEW cov: 12518 ft: 
14820 corp: 11/140b lim: 30 exec/s: 0 rss: 73Mb L: 11/27 MS: 1 ChangeBinInt- 00:07:45.254 [2024-11-17 11:02:09.878799] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300005f99 00:07:45.254 [2024-11-17 11:02:09.878976] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.254 [2024-11-17 11:02:09.879151] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.254 [2024-11-17 11:02:09.879520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998399 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.879550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.254 [2024-11-17 11:02:09.879675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.879694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.254 [2024-11-17 11:02:09.879822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.254 [2024-11-17 11:02:09.879839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.254 #20 NEW cov: 12518 ft: 14843 corp: 12/160b lim: 30 exec/s: 0 rss: 73Mb L: 20/27 MS: 1 ChangeBinInt- 00:07:45.516 [2024-11-17 11:02:09.928958] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.516 [2024-11-17 11:02:09.929113] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.516 [2024-11-17 11:02:09.929276] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.516 [2024-11-17 11:02:09.929623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.929652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:09.929777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.929798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:09.929922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.929938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.516 #21 NEW cov: 12518 ft: 14859 corp: 13/180b lim: 30 exec/s: 0 rss: 73Mb L: 20/27 MS: 1 ShuffleBytes- 00:07:45.516 [2024-11-17 11:02:09.979156] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.516 [2024-11-17 11:02:09.979328] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (419432) > buf size (4096) 00:07:45.516 
[2024-11-17 11:02:09.979492] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a99 00:07:45.516 [2024-11-17 11:02:09.979666] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.516 [2024-11-17 11:02:09.980014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.980045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:09.980165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.980184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:09.980306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e23029d cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.980326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:09.980446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998141 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:09.980464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.516 #22 NEW cov: 12518 ft: 14918 corp: 14/207b lim: 30 exec/s: 22 rss: 73Mb L: 27/27 MS: 1 ChangeByte- 00:07:45.516 [2024-11-17 11:02:10.049279] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (841728) > buf size (4096) 00:07:45.516 [2024-11-17 11:02:10.049461] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001a0a 00:07:45.516 [2024-11-17 11:02:10.049636] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x891a 00:07:45.516 [2024-11-17 11:02:10.050021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:10.050056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:10.050177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9d800235 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:10.050194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:10.050327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:35ff0024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:10.050346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.516 #23 NEW cov: 12518 ft: 14943 corp: 15/226b lim: 30 exec/s: 23 rss: 73Mb L: 19/27 MS: 1 CrossOver- 00:07:45.516 [2024-11-17 11:02:10.119515] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x100009999 00:07:45.516 [2024-11-17 11:02:10.119693] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.516 [2024-11-17 11:02:10.119841] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (927412) > buf size (4096) 00:07:45.516 [2024-11-17 11:02:10.120258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff8199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:10.120291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:10.120428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:10.120447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.516 [2024-11-17 11:02:10.120578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:89ac831e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.516 [2024-11-17 11:02:10.120598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.516 #24 NEW cov: 12518 ft: 14957 corp: 16/246b lim: 30 exec/s: 24 rss: 73Mb L: 20/27 MS: 1 CrossOver- 00:07:45.778 [2024-11-17 11:02:10.189766] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:45.778 [2024-11-17 11:02:10.189951] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:45.778 [2024-11-17 11:02:10.190115] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (655576) > buf size (4096) 00:07:45.778 [2024-11-17 11:02:10.190472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4ddb81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.190503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.778 [2024-11-17 11:02:10.190636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:4ddb81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.190654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.778 [2024-11-17 11:02:10.190781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8035021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.190799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.778 #25 NEW cov: 12518 ft: 14975 corp: 17/268b lim: 30 exec/s: 25 rss: 73Mb L: 22/27 MS: 1 CopyPart- 00:07:45.778 [2024-11-17 11:02:10.259931] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.778 [2024-11-17 11:02:10.260129] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.778 [2024-11-17 11:02:10.260307] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.778 [2024-11-17 11:02:10.260683] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.260714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.778 [2024-11-17 11:02:10.260845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.260865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.778 [2024-11-17 11:02:10.261000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.261020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.778 #26 NEW cov: 12518 ft: 14989 corp: 18/288b lim: 30 exec/s: 26 rss: 73Mb L: 20/27 MS: 1 ShuffleBytes- 00:07:45.778 [2024-11-17 11:02:10.309934] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:45.778 [2024-11-17 11:02:10.310361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ac2481ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.310390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.778 #27 NEW cov: 12518 ft: 15032 corp: 19/299b lim: 30 exec/s: 27 rss: 73Mb L: 11/27 MS: 1 ShuffleBytes- 00:07:45.778 [2024-11-17 11:02:10.360218] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000024ff 00:07:45.778 [2024-11-17 11:02:10.360413] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001a0a 00:07:45.778 [2024-11-17 11:02:10.360594] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x891a 00:07:45.778 [2024-11-17 11:02:10.360974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff8189 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.361007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.778 [2024-11-17 11:02:10.361131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:80ac0235 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.361150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.778 [2024-11-17 11:02:10.361274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:35ff0024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.778 [2024-11-17 11:02:10.361298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.778 #28 NEW cov: 12518 ft: 15056 corp: 20/318b lim: 30 exec/s: 28 rss: 73Mb L: 19/27 MS: 1 ShuffleBytes- 00:07:45.778 [2024-11-17 11:02:10.410382] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.778 [2024-11-17 11:02:10.410552] 
ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:45.778 [2024-11-17 11:02:10.410715] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000997a 00:07:45.778 [2024-11-17 11:02:10.411071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.779 [2024-11-17 11:02:10.411102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.779 [2024-11-17 11:02:10.411224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.779 [2024-11-17 11:02:10.411242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.779 [2024-11-17 11:02:10.411359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.779 [2024-11-17 11:02:10.411379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.040 #29 NEW cov: 12518 ft: 15069 corp: 21/338b lim: 30 exec/s: 29 rss: 73Mb L: 20/27 MS: 1 ChangeByte- 00:07:46.040 [2024-11-17 11:02:10.480746] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.040 [2024-11-17 11:02:10.480962] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (419432) > buf size (4096) 00:07:46.040 [2024-11-17 11:02:10.481135] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a99 00:07:46.040 [2024-11-17 11:02:10.481307] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.040 [2024-11-17 11:02:10.481689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.040 [2024-11-17 11:02:10.481720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.040 [2024-11-17 11:02:10.481843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.040 [2024-11-17 11:02:10.481862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.040 [2024-11-17 11:02:10.481988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8923029d cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.040 [2024-11-17 11:02:10.482005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.040 [2024-11-17 11:02:10.482132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998141 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.040 [2024-11-17 11:02:10.482152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.040 #30 NEW cov: 12518 ft: 15106 corp: 22/365b lim: 30 exec/s: 30 rss: 73Mb L: 27/27 MS: 1 ShuffleBytes- 00:07:46.040 
[2024-11-17 11:02:10.550962] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.040 [2024-11-17 11:02:10.551157] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (418944) > buf size (4096) 00:07:46.040 [2024-11-17 11:02:10.551329] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a99 00:07:46.040 [2024-11-17 11:02:10.551492] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.041 [2024-11-17 11:02:10.551881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.551912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.552046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:991f8199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.552063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.552191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8923029d cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.552212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.552343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998141 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.552365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.041 #31 NEW cov: 12518 ft: 15148 corp: 23/392b lim: 30 exec/s: 31 rss: 73Mb L: 27/27 MS: 1 ChangeByte- 00:07:46.041 [2024-11-17 11:02:10.621032] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:46.041 [2024-11-17 11:02:10.621215] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ac9d 00:07:46.041 [2024-11-17 11:02:10.621392] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (655576) > buf size (4096) 00:07:46.041 [2024-11-17 11:02:10.621753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4ddb81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.621783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.621904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:4ddb81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.621921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.622045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8035021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.622061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.041 #32 NEW cov: 12518 ft: 15153 corp: 24/415b lim: 30 exec/s: 32 rss: 74Mb L: 23/27 MS: 1 InsertByte- 00:07:46.041 [2024-11-17 11:02:10.691177] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009935 00:07:46.041 [2024-11-17 11:02:10.691358] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001a0a 00:07:46.041 [2024-11-17 11:02:10.691531] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a99 00:07:46.041 [2024-11-17 11:02:10.691699] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.041 [2024-11-17 11:02:10.692100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.692133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.692249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff248180 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.692268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.692397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8923029d cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.692417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.041 [2024-11-17 11:02:10.692542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998141 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.041 [2024-11-17 11:02:10.692562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.302 #33 NEW cov: 12518 ft: 15161 corp: 25/442b lim: 30 exec/s: 33 rss: 74Mb L: 27/27 MS: 1 CrossOver- 00:07:46.302 [2024-11-17 11:02:10.761482] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.302 [2024-11-17 11:02:10.761659] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.302 [2024-11-17 11:02:10.761827] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.302 [2024-11-17 11:02:10.762210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.762241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.302 [2024-11-17 11:02:10.762364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.762381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.302 [2024-11-17 11:02:10.762505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.762524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.302 #34 NEW cov: 12518 ft: 15202 corp: 26/463b lim: 30 exec/s: 34 rss: 74Mb L: 21/27 MS: 1 CopyPart- 00:07:46.302 [2024-11-17 11:02:10.831587] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:46.302 [2024-11-17 11:02:10.831792] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004199 00:07:46.302 [2024-11-17 11:02:10.832155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.832183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.302 [2024-11-17 11:02:10.832296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.832314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.302 #35 NEW cov: 12518 ft: 15472 corp: 27/480b lim: 30 exec/s: 35 rss: 74Mb L: 17/27 MS: 1 EraseBytes- 00:07:46.302 [2024-11-17 11:02:10.881731] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (841728) > buf size (4096) 00:07:46.302 [2024-11-17 11:02:10.881889] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001eff 00:07:46.302 [2024-11-17 11:02:10.882277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:35ff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.882307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.302 [2024-11-17 11:02:10.882420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9d358180 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.882441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.302 #36 NEW cov: 12518 ft: 15473 corp: 28/493b lim: 30 exec/s: 36 rss: 74Mb L: 13/27 MS: 1 CrossOver- 00:07:46.302 [2024-11-17 11:02:10.951831] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001a0a 00:07:46.302 [2024-11-17 11:02:10.952221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff248180 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.302 [2024-11-17 11:02:10.952251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.563 #37 NEW cov: 12518 ft: 15525 corp: 29/499b lim: 30 exec/s: 18 rss: 74Mb L: 6/27 MS: 1 EraseBytes- 00:07:46.563 #37 DONE cov: 12518 ft: 15525 corp: 29/499b lim: 30 exec/s: 18 rss: 74Mb 00:07:46.563 ###### Recommended dictionary. ###### 00:07:46.563 "\377\211\254\235\2005\036\032" # Uses: 0 00:07:46.563 ###### End of recommended dictionary. 
###### 00:07:46.563 Done 37 runs in 2 second(s) 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:46.563 11:02:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:46.563 [2024-11-17 11:02:11.116265] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:46.563 [2024-11-17 11:02:11.116350] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148928 ] 00:07:46.824 [2024-11-17 11:02:11.315797] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.824 [2024-11-17 11:02:11.328375] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.824 [2024-11-17 11:02:11.381023] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.824 [2024-11-17 11:02:11.397308] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:46.824 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:46.824 INFO: Seed: 3045315824 00:07:46.824 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:46.824 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:46.824 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:46.824 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.824 #2 INITED exec/s: 0 rss: 65Mb 00:07:46.824 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:46.824 This may also happen if the target rejected all inputs we tried so far 00:07:46.824 [2024-11-17 11:02:11.462603] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.824 [2024-11-17 11:02:11.462728] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.824 [2024-11-17 11:02:11.462942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.824 [2024-11-17 11:02:11.462975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.824 [2024-11-17 11:02:11.463031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.824 [2024-11-17 11:02:11.463051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.347 NEW_FUNC[1/715]: 0x455b38 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:47.347 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.347 #8 NEW cov: 12207 ft: 12207 corp: 2/20b lim: 35 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:47.347 [2024-11-17 11:02:11.793464] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.793600] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.793823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.793859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.793922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.793941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.347 #9 NEW cov: 12337 ft: 12545 corp: 3/39b lim: 35 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ChangeBinInt- 00:07:47.347 [2024-11-17 11:02:11.853860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.853889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.853946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.853961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.347 #17 NEW cov: 12353 ft: 13108 corp: 4/56b lim: 35 exec/s: 0 rss: 73Mb L: 17/19 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:47.347 [2024-11-17 11:02:11.893707] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.893833] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.893953] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.894075] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.894306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.894336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.894393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.894409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.894466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.894483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.894538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.894555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.347 #23 NEW cov: 12438 ft: 13902 corp: 5/88b lim: 35 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:07:47.347 [2024-11-17 11:02:11.934055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.934081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.934136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0800ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.934149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.347 #24 NEW cov: 12438 ft: 14065 corp: 6/105b lim: 35 exec/s: 0 rss: 73Mb L: 17/32 MS: 1 ChangeBinInt- 00:07:47.347 [2024-11-17 
11:02:11.994013] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.994149] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.994265] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.994376] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.347 [2024-11-17 11:02:11.994596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.994624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.994680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.994695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.994748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.994764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.347 [2024-11-17 11:02:11.994817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.347 [2024-11-17 11:02:11.994836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.609 #25 NEW cov: 12438 ft: 14132 corp: 7/138b lim: 35 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 CrossOver- 00:07:47.609 [2024-11-17 11:02:12.034051] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.609 [2024-11-17 11:02:12.034386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.034414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.609 [2024-11-17 11:02:12.034471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.034485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.609 #26 NEW cov: 12438 ft: 14284 corp: 8/157b lim: 35 exec/s: 0 rss: 73Mb L: 19/33 MS: 1 ChangeBit- 00:07:47.609 [2024-11-17 11:02:12.094507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.094534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.609 [2024-11-17 11:02:12.094591] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0800ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.094605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.609 #27 NEW cov: 12438 ft: 14341 corp: 9/177b lim: 35 exec/s: 0 rss: 73Mb L: 20/33 MS: 1 CopyPart- 00:07:47.609 [2024-11-17 11:02:12.154693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:5600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.154720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.609 [2024-11-17 11:02:12.154778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0800ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.154791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.609 #28 NEW cov: 12438 ft: 14446 corp: 10/194b lim: 35 exec/s: 0 rss: 73Mb L: 17/33 MS: 1 ChangeByte- 00:07:47.609 [2024-11-17 11:02:12.194659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28cd002a cdw11:9f007d5f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.194686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.609 #30 NEW cov: 12438 ft: 14746 corp: 11/203b lim: 35 exec/s: 0 rss: 73Mb L: 9/33 MS: 2 ChangeBit-CMP- DE: "(\315}_\237\254\212\000"- 00:07:47.609 [2024-11-17 11:02:12.234599] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.609 [2024-11-17 11:02:12.234942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.234968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.609 [2024-11-17 11:02:12.235026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.609 [2024-11-17 11:02:12.235046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.870 #31 NEW cov: 12438 ft: 14757 corp: 12/222b lim: 35 exec/s: 0 rss: 73Mb L: 19/33 MS: 1 ShuffleBytes- 00:07:47.870 [2024-11-17 11:02:12.295077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:5600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.870 [2024-11-17 11:02:12.295104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.870 [2024-11-17 11:02:12.295178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff2700ff cdw11:0800ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.870 [2024-11-17 11:02:12.295192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.870 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:47.870 #32 NEW cov: 12461 ft: 14786 corp: 13/239b lim: 35 exec/s: 0 rss: 73Mb L: 17/33 MS: 1 ChangeByte- 00:07:47.870 [2024-11-17 11:02:12.355285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff8200ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.870 [2024-11-17 11:02:12.355311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.870 [2024-11-17 11:02:12.355372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.870 [2024-11-17 11:02:12.355387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.870 #33 NEW cov: 12461 ft: 14805 corp: 14/256b lim: 35 exec/s: 0 rss: 73Mb L: 17/33 MS: 1 ShuffleBytes- 00:07:47.870 [2024-11-17 11:02:12.395239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28cd006a cdw11:9f007d5f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.870 [2024-11-17 11:02:12.395264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.870 #34 NEW cov: 12461 ft: 14846 corp: 15/265b lim: 35 exec/s: 0 rss: 73Mb L: 9/33 MS: 1 ChangeBit- 00:07:47.870 [2024-11-17 11:02:12.455251] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.871 [2024-11-17 11:02:12.455388] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.871 [2024-11-17 11:02:12.455613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.871 [2024-11-17 11:02:12.455642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.871 [2024-11-17 11:02:12.455702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.871 [2024-11-17 11:02:12.455719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.871 #35 NEW cov: 12461 ft: 14852 corp: 16/284b lim: 35 exec/s: 35 rss: 73Mb L: 19/33 MS: 1 ChangeBinInt- 00:07:47.871 [2024-11-17 11:02:12.495489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:827e0082 cdw11:7e008282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.871 [2024-11-17 11:02:12.495514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.871 #39 NEW cov: 12461 ft: 14858 corp: 17/292b lim: 35 exec/s: 39 rss: 73Mb L: 8/33 MS: 4 CrossOver-ShuffleBytes-InsertByte-CopyPart- 00:07:48.132 [2024-11-17 11:02:12.535458] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.132 [2024-11-17 11:02:12.535783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.132 [2024-11-17 11:02:12.535813] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.132 [2024-11-17 11:02:12.535871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000a0004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.132 [2024-11-17 11:02:12.535885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.132 #40 NEW cov: 12461 ft: 14929 corp: 18/310b lim: 35 exec/s: 40 rss: 73Mb L: 18/33 MS: 1 EraseBytes- 00:07:48.132 [2024-11-17 11:02:12.575584] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.132 [2024-11-17 11:02:12.575711] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.132 [2024-11-17 11:02:12.575933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00006a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.132 [2024-11-17 11:02:12.575962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.132 [2024-11-17 11:02:12.576019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.132 [2024-11-17 11:02:12.576036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.133 #41 NEW cov: 12461 ft: 14957 corp: 19/330b lim: 35 exec/s: 41 rss: 73Mb L: 20/33 MS: 1 InsertByte- 00:07:48.133 [2024-11-17 11:02:12.636037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:5600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.636067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.133 [2024-11-17 11:02:12.636140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0800ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.636154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.133 #42 NEW cov: 12461 ft: 14968 corp: 20/347b lim: 35 exec/s: 42 rss: 73Mb L: 17/33 MS: 1 ChangeBinInt- 00:07:48.133 [2024-11-17 11:02:12.675901] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.133 [2024-11-17 11:02:12.676027] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.133 [2024-11-17 11:02:12.676149] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.133 [2024-11-17 11:02:12.676265] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.133 [2024-11-17 11:02:12.676492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.676521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.133 [2024-11-17 11:02:12.676580] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.676597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.133 [2024-11-17 11:02:12.676653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.676668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.133 [2024-11-17 11:02:12.676725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.676745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.133 #43 NEW cov: 12461 ft: 14983 corp: 21/376b lim: 35 exec/s: 43 rss: 74Mb L: 29/33 MS: 1 InsertRepeatedBytes- 00:07:48.133 [2024-11-17 11:02:12.716080] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.133 [2024-11-17 11:02:12.716313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.716339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.133 [2024-11-17 11:02:12.716398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.716414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.133 #44 NEW cov: 12461 ft: 14992 corp: 22/390b lim: 35 exec/s: 44 rss: 74Mb L: 14/33 MS: 1 CrossOver- 00:07:48.133 [2024-11-17 11:02:12.756391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.756417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.133 [2024-11-17 11:02:12.756475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:f700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.133 [2024-11-17 11:02:12.756488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.133 #45 NEW cov: 12461 ft: 15072 corp: 23/408b lim: 35 exec/s: 45 rss: 74Mb L: 18/33 MS: 1 InsertByte- 00:07:48.395 [2024-11-17 11:02:12.796381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:827e0082 cdw11:76008282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.796407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.395 #46 NEW cov: 12461 ft: 15125 corp: 24/416b lim: 35 exec/s: 46 rss: 74Mb L: 8/33 MS: 1 ChangeBit- 00:07:48.395 [2024-11-17 
11:02:12.856684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.856710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.395 [2024-11-17 11:02:12.856788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0800ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.856802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.395 #47 NEW cov: 12461 ft: 15126 corp: 25/433b lim: 35 exec/s: 47 rss: 74Mb L: 17/33 MS: 1 CrossOver- 00:07:48.395 [2024-11-17 11:02:12.896507] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.395 [2024-11-17 11:02:12.896635] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.395 [2024-11-17 11:02:12.896858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.896886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.395 [2024-11-17 11:02:12.896945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:04000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.896962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.395 #48 NEW cov: 12461 ft: 15163 corp: 26/452b lim: 35 exec/s: 48 rss: 74Mb L: 19/33 MS: 1 ChangeBinInt- 00:07:48.395 [2024-11-17 11:02:12.937047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:5600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.937072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.395 [2024-11-17 11:02:12.937147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28cd00ff cdw11:9f007d5f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.937161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.395 [2024-11-17 11:02:12.937217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00ff008a cdw11:ff0027ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.937231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.395 #49 NEW cov: 12461 ft: 15356 corp: 27/477b lim: 35 exec/s: 49 rss: 74Mb L: 25/33 MS: 1 PersAutoDict- DE: "(\315}_\237\254\212\000"- 00:07:48.395 [2024-11-17 11:02:12.997244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:79790079 cdw11:79007979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.997269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:48.395 [2024-11-17 11:02:12.997327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:79790079 cdw11:79007979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.997341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.395 [2024-11-17 11:02:12.997396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:79790079 cdw11:79007979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:12.997410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.395 #51 NEW cov: 12461 ft: 15361 corp: 28/499b lim: 35 exec/s: 51 rss: 74Mb L: 22/33 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:48.395 [2024-11-17 11:02:13.037051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:5600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.395 [2024-11-17 11:02:13.037076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.657 #52 NEW cov: 12461 ft: 15370 corp: 29/509b lim: 35 exec/s: 52 rss: 74Mb L: 10/33 MS: 1 EraseBytes- 00:07:48.657 [2024-11-17 11:02:13.077054] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.657 [2024-11-17 11:02:13.077211] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.657 [2024-11-17 11:02:13.077456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00006a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.077483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.657 [2024-11-17 11:02:13.077544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.077560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.657 #53 NEW cov: 12461 ft: 15393 corp: 30/529b lim: 35 exec/s: 53 rss: 74Mb L: 20/33 MS: 1 ShuffleBytes- 00:07:48.657 [2024-11-17 11:02:13.137204] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.657 [2024-11-17 11:02:13.137335] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.657 [2024-11-17 11:02:13.137557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.137584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.657 [2024-11-17 11:02:13.137642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.137658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.657 #54 NEW cov: 12461 ft: 15415 corp: 
31/548b lim: 35 exec/s: 54 rss: 74Mb L: 19/33 MS: 1 ChangeBit- 00:07:48.657 [2024-11-17 11:02:13.177596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0029 cdw11:ff0082ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.177623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.657 [2024-11-17 11:02:13.177695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.177709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.657 #55 NEW cov: 12461 ft: 15421 corp: 32/566b lim: 35 exec/s: 55 rss: 74Mb L: 18/33 MS: 1 InsertByte- 00:07:48.657 [2024-11-17 11:02:13.237472] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.657 [2024-11-17 11:02:13.237616] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.657 [2024-11-17 11:02:13.237839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00006a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.237867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.657 [2024-11-17 11:02:13.237923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.237938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.657 #56 NEW cov: 12461 ft: 15453 corp: 33/580b lim: 35 exec/s: 56 rss: 74Mb L: 14/33 MS: 1 EraseBytes- 00:07:48.657 [2024-11-17 11:02:13.277871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.277896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.657 [2024-11-17 11:02:13.277972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:f700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.657 [2024-11-17 11:02:13.277986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.918 #57 NEW cov: 12461 ft: 15488 corp: 34/598b lim: 35 exec/s: 57 rss: 74Mb L: 18/33 MS: 1 ChangeByte- 00:07:48.918 [2024-11-17 11:02:13.337747] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.919 [2024-11-17 11:02:13.337875] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.919 [2024-11-17 11:02:13.338121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00006a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.919 [2024-11-17 11:02:13.338150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.919 [2024-11-17 
11:02:13.338209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000018 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.919 [2024-11-17 11:02:13.338225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.919 #58 NEW cov: 12461 ft: 15547 corp: 35/618b lim: 35 exec/s: 58 rss: 74Mb L: 20/33 MS: 1 ChangeByte- 00:07:48.919 [2024-11-17 11:02:13.378185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:82ff0082 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.919 [2024-11-17 11:02:13.378212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.919 [2024-11-17 11:02:13.378271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f70800ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.919 [2024-11-17 11:02:13.378286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.919 #59 NEW cov: 12461 ft: 15591 corp: 36/632b lim: 35 exec/s: 59 rss: 74Mb L: 14/33 MS: 1 EraseBytes- 00:07:48.919 [2024-11-17 11:02:13.418079] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:48.919 [2024-11-17 11:02:13.418326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:827e0082 cdw11:00008200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.919 [2024-11-17 11:02:13.418353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.919 [2024-11-17 11:02:13.418410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:82000010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.919 [2024-11-17 11:02:13.418426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.919 #60 NEW cov: 12461 ft: 15665 corp: 37/648b lim: 35 exec/s: 30 rss: 74Mb L: 16/33 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\020"- 00:07:48.919 #60 DONE cov: 12461 ft: 15665 corp: 37/648b lim: 35 exec/s: 30 rss: 74Mb 00:07:48.919 ###### Recommended dictionary. ###### 00:07:48.919 "(\315}_\237\254\212\000" # Uses: 1 00:07:48.919 "\000\000\000\000\000\000\000\020" # Uses: 0 00:07:48.919 ###### End of recommended dictionary. 
######
00:07:48.919 Done 60 runs in 2 second(s)
00:07:48.919 11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403'
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
11:02:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3
00:07:49.180 [2024-11-17 11:02:13.603087] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization...
00:07:49.180 [2024-11-17 11:02:13.603163] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149339 ]
00:07:49.180 [2024-11-17 11:02:13.807034] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:49.180 [2024-11-17 11:02:13.820132] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:49.441 [2024-11-17 11:02:13.872709] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:49.441 [2024-11-17 11:02:13.889059] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 ***
00:07:49.441 INFO: Running with entropic power schedule (0xFF, 100).
00:07:49.441 INFO: Seed: 1239371677 00:07:49.441 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:49.441 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:49.441 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:49.441 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.441 #2 INITED exec/s: 0 rss: 65Mb 00:07:49.441 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:49.441 This may also happen if the target rejected all inputs we tried so far 00:07:49.702 NEW_FUNC[1/703]: 0x457818 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:49.702 NEW_FUNC[2/703]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:49.702 #4 NEW cov: 12130 ft: 12129 corp: 2/11b lim: 20 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:49.702 [2024-11-17 11:02:14.279200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.702 [2024-11-17 11:02:14.279261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.702 NEW_FUNC[1/20]: 0x1383c48 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3482 00:07:49.702 NEW_FUNC[2/20]: 0x13847c8 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3424 00:07:49.702 #5 NEW cov: 12564 ft: 13365 corp: 3/27b lim: 20 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:49.963 #6 NEW cov: 12570 ft: 13579 corp: 4/37b lim: 20 exec/s: 0 rss: 72Mb L: 10/16 MS: 1 InsertRepeatedBytes- 00:07:49.963 #7 NEW cov: 12655 ft: 13855 corp: 5/47b lim: 20 exec/s: 0 rss: 72Mb L: 10/16 MS: 1 ChangeBit- 00:07:49.963 [2024-11-17 11:02:14.439299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.963 [2024-11-17 11:02:14.439328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.963 #8 NEW cov: 12655 ft: 13966 corp: 6/63b lim: 20 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 CopyPart- 00:07:49.963 #9 NEW cov: 12655 ft: 14027 corp: 7/73b lim: 20 exec/s: 0 rss: 72Mb L: 10/16 MS: 1 CopyPart- 00:07:49.963 #15 NEW cov: 12659 ft: 14179 corp: 8/86b lim: 20 exec/s: 0 rss: 72Mb L: 13/16 MS: 1 CopyPart- 00:07:50.224 [2024-11-17 11:02:14.619826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.224 [2024-11-17 11:02:14.619857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.224 NEW_FUNC[1/1]: 0x15a9278 in _nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3649 00:07:50.224 #16 NEW cov: 12686 ft: 14268 corp: 9/103b lim: 20 exec/s: 0 rss: 73Mb L: 17/17 MS: 1 InsertByte- 00:07:50.224 #17 NEW cov: 12686 ft: 14332 corp: 10/120b lim: 20 exec/s: 0 rss: 73Mb L: 17/17 MS: 1 CopyPart- 00:07:50.224 #18 NEW cov: 12686 ft: 14452 corp: 11/133b lim: 20 exec/s: 0 rss: 73Mb L: 13/17 MS: 1 ChangeByte- 00:07:50.224 
NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:50.224 #19 NEW cov: 12709 ft: 14503 corp: 12/148b lim: 20 exec/s: 0 rss: 73Mb L: 15/17 MS: 1 EraseBytes- 00:07:50.224 [2024-11-17 11:02:14.870520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.224 [2024-11-17 11:02:14.870547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.485 #20 NEW cov: 12709 ft: 14560 corp: 13/166b lim: 20 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 InsertByte- 00:07:50.485 #21 NEW cov: 12709 ft: 14736 corp: 14/176b lim: 20 exec/s: 21 rss: 73Mb L: 10/18 MS: 1 CMP- DE: "\000\000\000\017"- 00:07:50.485 [2024-11-17 11:02:14.950722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.485 [2024-11-17 11:02:14.950749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.485 #22 NEW cov: 12709 ft: 14791 corp: 15/195b lim: 20 exec/s: 22 rss: 73Mb L: 19/19 MS: 1 InsertByte- 00:07:50.485 #23 NEW cov: 12709 ft: 14810 corp: 16/207b lim: 20 exec/s: 23 rss: 73Mb L: 12/19 MS: 1 EraseBytes- 00:07:50.485 #24 NEW cov: 12709 ft: 14874 corp: 17/217b lim: 20 exec/s: 24 rss: 73Mb L: 10/19 MS: 1 ShuffleBytes- 00:07:50.485 #25 NEW cov: 12709 ft: 15175 corp: 18/222b lim: 20 exec/s: 25 rss: 73Mb L: 5/19 MS: 1 EraseBytes- 00:07:50.746 #26 NEW cov: 12709 ft: 15199 corp: 19/238b lim: 20 exec/s: 26 rss: 73Mb L: 16/19 MS: 1 InsertByte- 00:07:50.746 #27 NEW cov: 12709 ft: 15209 corp: 20/254b lim: 20 exec/s: 27 rss: 73Mb L: 16/19 MS: 1 InsertRepeatedBytes- 00:07:50.746 #28 NEW cov: 12709 ft: 15275 corp: 21/259b lim: 20 exec/s: 28 rss: 73Mb L: 5/19 MS: 1 ChangeByte- 00:07:50.746 #29 NEW cov: 12709 ft: 15308 corp: 22/271b lim: 20 exec/s: 29 rss: 73Mb L: 12/19 MS: 1 ChangeBit- 00:07:50.746 #30 NEW cov: 12709 ft: 15331 corp: 23/286b lim: 20 exec/s: 30 rss: 74Mb L: 15/19 MS: 1 CrossOver- 00:07:51.008 [2024-11-17 11:02:15.412098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.008 [2024-11-17 11:02:15.412126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.008 #31 NEW cov: 12709 ft: 15398 corp: 24/305b lim: 20 exec/s: 31 rss: 74Mb L: 19/19 MS: 1 PersAutoDict- DE: "\000\000\000\017"- 00:07:51.008 #32 NEW cov: 12709 ft: 15404 corp: 25/322b lim: 20 exec/s: 32 rss: 74Mb L: 17/19 MS: 1 CrossOver- 00:07:51.008 #35 NEW cov: 12709 ft: 15473 corp: 26/328b lim: 20 exec/s: 35 rss: 74Mb L: 6/19 MS: 3 InsertByte-ChangeBinInt-PersAutoDict- DE: "\000\000\000\017"- 00:07:51.008 #36 NEW cov: 12709 ft: 15487 corp: 27/344b lim: 20 exec/s: 36 rss: 74Mb L: 16/19 MS: 1 ChangeBinInt- 00:07:51.008 #37 NEW cov: 12709 ft: 15535 corp: 28/362b lim: 20 exec/s: 37 rss: 74Mb L: 18/19 MS: 1 InsertByte- 00:07:51.269 #38 NEW cov: 12709 ft: 15565 corp: 29/367b lim: 20 exec/s: 38 rss: 74Mb L: 5/19 MS: 1 CrossOver- 00:07:51.269 #39 NEW cov: 12709 ft: 15600 corp: 30/381b lim: 20 exec/s: 39 rss: 74Mb L: 14/19 MS: 1 CrossOver- 00:07:51.270 #40 NEW cov: 12709 ft: 15653 corp: 31/397b lim: 20 exec/s: 40 rss: 74Mb L: 16/19 MS: 1 ChangeBinInt- 00:07:51.270 #41 NEW cov: 12709 ft: 15669 corp: 32/413b lim: 20 
exec/s: 41 rss: 74Mb L: 16/19 MS: 1 ChangeBit-
00:07:51.270 #42 NEW cov: 12709 ft: 15736 corp: 33/432b lim: 20 exec/s: 42 rss: 74Mb L: 19/19 MS: 1 CrossOver-
00:07:51.270 #43 NEW cov: 12709 ft: 15742 corp: 34/451b lim: 20 exec/s: 43 rss: 74Mb L: 19/19 MS: 1 InsertRepeatedBytes-
00:07:51.531 #44 NEW cov: 12709 ft: 15754 corp: 35/469b lim: 20 exec/s: 22 rss: 74Mb L: 18/19 MS: 1 CrossOver-
00:07:51.531 #44 DONE cov: 12709 ft: 15754 corp: 35/469b lim: 20 exec/s: 22 rss: 74Mb
00:07:51.531 ###### Recommended dictionary. ######
00:07:51.531 "\000\000\000\017" # Uses: 2
00:07:51.531 ###### End of recommended dictionary. ######
00:07:51.531 Done 44 runs in 2 second(s)
00:07:51.531 11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404'
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
11:02:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4
00:07:51.820 [2024-11-17 11:02:16.117750] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization...
00:07:51.531 [2024-11-17 11:02:16.117819] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149749 ]
00:07:51.820 [2024-11-17 11:02:16.315019] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:51.820 [2024-11-17 11:02:16.327629] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:51.820 [2024-11-17 11:02:16.380020] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:51.820 [2024-11-17 11:02:16.396369] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 ***
00:07:51.820 INFO: Running with entropic power schedule (0xFF, 100).
00:07:51.820 INFO: Seed: 3749370303
00:07:51.820 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220),
00:07:51.820 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360),
00:07:51.820 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
00:07:51.820 INFO: A corpus is not provided, starting from an empty corpus
00:07:51.820 #2 INITED exec/s: 0 rss: 65Mb
00:07:51.820 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:51.821 This may also happen if the target rejected all inputs we tried so far
00:07:51.821 [2024-11-17 11:02:16.461732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:51.821 [2024-11-17 11:02:16.461760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:52.342 NEW_FUNC[1/716]: 0x458918 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126
00:07:52.342 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:52.342 #11 NEW cov: 12227 ft: 12226 corp: 2/8b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 4 CopyPart-ShuffleBytes-CMP-CopyPart- DE: "\377\377"-
00:07:52.342 [2024-11-17 11:02:16.793037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:17178a17 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:52.342 [2024-11-17 11:02:16.793109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:52.342 #14 NEW cov: 12357 ft: 13544 corp: 3/23b lim: 35 exec/s: 0 rss: 72Mb L: 15/15 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes-
00:07:52.343 [2024-11-17 11:02:16.842685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:52.343 [2024-11-17 11:02:16.842710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0
sqhd:000f p:0 m:0 dnr:0 00:07:52.343 #15 NEW cov: 12363 ft: 13795 corp: 4/30b lim: 35 exec/s: 0 rss: 72Mb L: 7/15 MS: 1 ChangeByte- 00:07:52.343 [2024-11-17 11:02:16.902987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:17178a17 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.343 [2024-11-17 11:02:16.903013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.343 [2024-11-17 11:02:16.903084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:98171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.343 [2024-11-17 11:02:16.903098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.343 #16 NEW cov: 12448 ft: 14123 corp: 5/46b lim: 35 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 InsertByte- 00:07:52.343 [2024-11-17 11:02:16.963026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.343 [2024-11-17 11:02:16.963056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.604 #17 NEW cov: 12448 ft: 14225 corp: 6/53b lim: 35 exec/s: 0 rss: 72Mb L: 7/16 MS: 1 ShuffleBytes- 00:07:52.604 [2024-11-17 11:02:17.023189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.604 [2024-11-17 11:02:17.023212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.604 #18 NEW cov: 12448 ft: 14292 corp: 7/66b lim: 35 exec/s: 0 rss: 72Mb L: 13/16 MS: 1 CopyPart- 00:07:52.604 [2024-11-17 11:02:17.083352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.604 [2024-11-17 11:02:17.083377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.604 #19 NEW cov: 12448 ft: 14427 corp: 8/74b lim: 35 exec/s: 0 rss: 72Mb L: 8/16 MS: 1 InsertByte- 00:07:52.604 [2024-11-17 11:02:17.123434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.604 [2024-11-17 11:02:17.123459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.604 #20 NEW cov: 12448 ft: 14497 corp: 9/87b lim: 35 exec/s: 0 rss: 73Mb L: 13/16 MS: 1 ChangeBit- 00:07:52.604 [2024-11-17 11:02:17.183610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.604 [2024-11-17 11:02:17.183635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.604 #21 NEW cov: 12448 ft: 14556 corp: 10/94b lim: 35 exec/s: 0 rss: 73Mb L: 7/16 MS: 1 ChangeBit- 00:07:52.604 [2024-11-17 11:02:17.223879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:070ad9ff cdw11:0aff0000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:52.604 [2024-11-17 11:02:17.223905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.604 [2024-11-17 11:02:17.223977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a0a0a07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.604 [2024-11-17 11:02:17.223991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.604 #22 NEW cov: 12448 ft: 14637 corp: 11/108b lim: 35 exec/s: 0 rss: 73Mb L: 14/16 MS: 1 InsertByte- 00:07:52.864 [2024-11-17 11:02:17.263839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.263863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.864 #23 NEW cov: 12448 ft: 14734 corp: 12/118b lim: 35 exec/s: 0 rss: 73Mb L: 10/16 MS: 1 EraseBytes- 00:07:52.864 [2024-11-17 11:02:17.323992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.324018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.864 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:52.864 #29 NEW cov: 12471 ft: 14778 corp: 13/125b lim: 35 exec/s: 0 rss: 73Mb L: 7/16 MS: 1 CopyPart- 00:07:52.864 [2024-11-17 11:02:17.364063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7cff7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.364088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.864 #33 NEW cov: 12471 ft: 14808 corp: 14/137b lim: 35 exec/s: 0 rss: 73Mb L: 12/16 MS: 4 CopyPart-InsertByte-PersAutoDict-InsertRepeatedBytes- DE: "\377\377"- 00:07:52.864 [2024-11-17 11:02:17.404341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:070aff2a cdw11:0aff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.404365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.864 [2024-11-17 11:02:17.404418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a0a0a07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.404431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.864 #34 NEW cov: 12471 ft: 14814 corp: 15/151b lim: 35 exec/s: 0 rss: 73Mb L: 14/16 MS: 1 InsertByte- 00:07:52.864 [2024-11-17 11:02:17.444305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:5bff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.444330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.864 #35 NEW cov: 12471 
ft: 14842 corp: 16/159b lim: 35 exec/s: 35 rss: 73Mb L: 8/16 MS: 1 InsertByte- 00:07:52.864 [2024-11-17 11:02:17.484405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:070a30ff cdw11:0aff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.864 [2024-11-17 11:02:17.484430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.864 #36 NEW cov: 12471 ft: 14853 corp: 17/167b lim: 35 exec/s: 36 rss: 73Mb L: 8/16 MS: 1 InsertByte- 00:07:53.125 [2024-11-17 11:02:17.524573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.125 [2024-11-17 11:02:17.524598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.126 #37 NEW cov: 12471 ft: 14873 corp: 18/174b lim: 35 exec/s: 37 rss: 73Mb L: 7/16 MS: 1 PersAutoDict- DE: "\377\377"- 00:07:53.126 [2024-11-17 11:02:17.565134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a28fd cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.565159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.565212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.565226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.565279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.565292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.565346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.565360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.126 #46 NEW cov: 12471 ft: 15255 corp: 19/207b lim: 35 exec/s: 46 rss: 73Mb L: 33/33 MS: 4 PersAutoDict-ChangeBit-InsertByte-InsertRepeatedBytes- DE: "\377\377"- 00:07:53.126 [2024-11-17 11:02:17.604742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0dffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.604766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.126 #47 NEW cov: 12471 ft: 15336 corp: 20/214b lim: 35 exec/s: 47 rss: 73Mb L: 7/33 MS: 1 ChangeBinInt- 00:07:53.126 [2024-11-17 11:02:17.644908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.644934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:53.126 #48 NEW cov: 12471 ft: 15351 corp: 21/227b lim: 35 exec/s: 48 rss: 73Mb L: 13/33 MS: 1 ShuffleBytes- 00:07:53.126 [2024-11-17 11:02:17.685175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.685199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.685258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0aff070a cdw11:3a2a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.685271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.126 #49 NEW cov: 12471 ft: 15400 corp: 22/241b lim: 35 exec/s: 49 rss: 73Mb L: 14/33 MS: 1 InsertByte- 00:07:53.126 [2024-11-17 11:02:17.725647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.725672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.725740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.725754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.725806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.725820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.126 [2024-11-17 11:02:17.725873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c0ffc0c0 cdw11:ff0a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.126 [2024-11-17 11:02:17.725887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.126 #50 NEW cov: 12471 ft: 15407 corp: 23/272b lim: 35 exec/s: 50 rss: 73Mb L: 31/33 MS: 1 InsertRepeatedBytes- 00:07:53.386 [2024-11-17 11:02:17.785523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0aff07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.386 [2024-11-17 11:02:17.785548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.386 [2024-11-17 11:02:17.785618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0aff070a cdw11:3a2a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.386 [2024-11-17 11:02:17.785632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.387 #51 NEW cov: 12471 ft: 15413 corp: 24/286b lim: 35 exec/s: 51 rss: 73Mb L: 14/33 MS: 1 ShuffleBytes- 00:07:53.387 [2024-11-17 11:02:17.845463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:4 nsid:0 cdw10:30ffffff cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.845487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.387 #52 NEW cov: 12471 ft: 15423 corp: 25/293b lim: 35 exec/s: 52 rss: 73Mb L: 7/33 MS: 1 ChangeByte- 00:07:53.387 [2024-11-17 11:02:17.885596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a2c0aff cdw11:070a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.885620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.387 #55 NEW cov: 12471 ft: 15436 corp: 26/300b lim: 35 exec/s: 55 rss: 73Mb L: 7/33 MS: 3 EraseBytes-ShuffleBytes-InsertByte- 00:07:53.387 [2024-11-17 11:02:17.926159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.926184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.387 [2024-11-17 11:02:17.926237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.926254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.387 [2024-11-17 11:02:17.926322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.926336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.387 [2024-11-17 11:02:17.926388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c0ffc0c0 cdw11:ff0a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.926402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.387 #56 NEW cov: 12471 ft: 15476 corp: 27/332b lim: 35 exec/s: 56 rss: 73Mb L: 32/33 MS: 1 InsertByte- 00:07:53.387 [2024-11-17 11:02:17.985883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0affff cdw11:0aff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.387 [2024-11-17 11:02:17.985910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.387 #57 NEW cov: 12471 ft: 15488 corp: 28/339b lim: 35 exec/s: 57 rss: 73Mb L: 7/33 MS: 1 CopyPart- 00:07:53.647 [2024-11-17 11:02:18.046544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002360 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.647 [2024-11-17 11:02:18.046570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.647 [2024-11-17 11:02:18.046641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.647 [2024-11-17 11:02:18.046656] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.648 [2024-11-17 11:02:18.046710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.046723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.648 [2024-11-17 11:02:18.046776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.046790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.648 #62 NEW cov: 12471 ft: 15492 corp: 29/371b lim: 35 exec/s: 62 rss: 73Mb L: 32/33 MS: 5 CrossOver-ChangeByte-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:53.648 [2024-11-17 11:02:18.106425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:070ad9ff cdw11:0aff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.106450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.648 [2024-11-17 11:02:18.106506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a0a2b07 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.106520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.648 #63 NEW cov: 12471 ft: 15504 corp: 30/385b lim: 35 exec/s: 63 rss: 74Mb L: 14/33 MS: 1 ChangeByte- 00:07:53.648 [2024-11-17 11:02:18.166423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0dffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.166448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.648 #64 NEW cov: 12471 ft: 15533 corp: 31/392b lim: 35 exec/s: 64 rss: 74Mb L: 7/33 MS: 1 PersAutoDict- DE: "\377\377"- 00:07:53.648 [2024-11-17 11:02:18.227022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.227054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.648 [2024-11-17 11:02:18.227122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.227137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.648 [2024-11-17 11:02:18.227191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.227207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.648 [2024-11-17 11:02:18.227260] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c0ffc0c0 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.227273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.648 #65 NEW cov: 12471 ft: 15546 corp: 32/426b lim: 35 exec/s: 65 rss: 74Mb L: 34/34 MS: 1 PersAutoDict- DE: "\377\377"- 00:07:53.648 [2024-11-17 11:02:18.286749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.648 [2024-11-17 11:02:18.286774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.908 #66 NEW cov: 12471 ft: 15550 corp: 33/433b lim: 35 exec/s: 66 rss: 74Mb L: 7/34 MS: 1 EraseBytes- 00:07:53.908 [2024-11-17 11:02:18.347541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.347567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.347637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.347651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.347704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.347717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.347770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:070a00ff cdw11:0aff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.347784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.347836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:070a0a0a cdw11:ff2a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.347850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.908 #67 NEW cov: 12471 ft: 15613 corp: 34/468b lim: 35 exec/s: 67 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:53.908 [2024-11-17 11:02:18.407573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:17178a17 cdw11:17170002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.407602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.407672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.407687] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.407741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.407755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.908 [2024-11-17 11:02:18.407807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:17177777 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.407821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.908 #68 NEW cov: 12471 ft: 15630 corp: 35/500b lim: 35 exec/s: 68 rss: 74Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:07:53.908 [2024-11-17 11:02:18.447196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0afffeff cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.908 [2024-11-17 11:02:18.447221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.908 #69 NEW cov: 12471 ft: 15652 corp: 36/507b lim: 35 exec/s: 34 rss: 74Mb L: 7/35 MS: 1 ChangeBit- 00:07:53.908 #69 DONE cov: 12471 ft: 15652 corp: 36/507b lim: 35 exec/s: 34 rss: 74Mb 00:07:53.908 ###### Recommended dictionary. ###### 00:07:53.908 "\377\377" # Uses: 5 00:07:53.908 ###### End of recommended dictionary. ###### 00:07:53.908 Done 69 runs in 2 second(s) 00:07:53.909 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:54.169 11:02:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:54.169 [2024-11-17 11:02:18.609749] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:54.169 [2024-11-17 11:02:18.609827] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid150282 ] 00:07:54.169 [2024-11-17 11:02:18.809714] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.169 [2024-11-17 11:02:18.822334] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.430 [2024-11-17 11:02:18.874564] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.430 [2024-11-17 11:02:18.890893] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:54.430 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.430 INFO: Seed: 1948400390 00:07:54.430 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:54.430 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:54.430 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:54.430 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.430 #2 INITED exec/s: 0 rss: 65Mb 00:07:54.430 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:54.430 This may also happen if the target rejected all inputs we tried so far 00:07:54.430 [2024-11-17 11:02:18.956321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c1720a0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.430 [2024-11-17 11:02:18.956349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.691 NEW_FUNC[1/716]: 0x45aab8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:54.691 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.691 #5 NEW cov: 12238 ft: 12235 corp: 2/11b lim: 45 exec/s: 0 rss: 72Mb L: 10/10 MS: 3 CopyPart-ChangeBit-CMP- DE: "\012\301rL\243\254\212\000"- 00:07:54.691 [2024-11-17 11:02:19.287247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.691 [2024-11-17 11:02:19.287291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.691 #6 NEW cov: 12368 ft: 12652 corp: 3/26b lim: 45 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 CopyPart- 00:07:54.951 [2024-11-17 11:02:19.347913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.347940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.347994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.348008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.348066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.348080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.348135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.348152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.348204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.348218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.951 #7 NEW cov: 12374 ft: 13803 corp: 4/71b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:54.951 [2024-11-17 11:02:19.407996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 
cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.408022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.408097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.408112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.408165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.408178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.408239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.408252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.408306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.408319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.951 #8 NEW cov: 12459 ft: 14041 corp: 5/116b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeBit- 00:07:54.951 [2024-11-17 11:02:19.467540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c1720a0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.467566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.951 #9 NEW cov: 12459 ft: 14265 corp: 6/126b lim: 45 exec/s: 0 rss: 72Mb L: 10/45 MS: 1 ShuffleBytes- 00:07:54.951 [2024-11-17 11:02:19.508258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.508284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.508353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.508368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.508420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.508434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.508488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.508505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.508559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.508572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.951 #10 NEW cov: 12459 ft: 14381 corp: 7/171b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeByte- 00:07:54.951 [2024-11-17 11:02:19.548375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.548400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.548453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.548467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.548521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.548551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.548604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.548617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.548667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.548682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.951 #16 NEW cov: 12459 ft: 14461 corp: 8/216b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeBit- 00:07:54.951 [2024-11-17 11:02:19.588544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.588571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.588627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.588641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.588695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.588709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.588763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0ac1ffff cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.951 [2024-11-17 11:02:19.588776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.951 [2024-11-17 11:02:19.588829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.952 [2024-11-17 11:02:19.588842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.211 #17 NEW cov: 12459 ft: 14490 corp: 9/261b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 PersAutoDict- DE: "\012\301rL\243\254\212\000"- 00:07:55.211 [2024-11-17 11:02:19.648638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.211 [2024-11-17 11:02:19.648665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.211 [2024-11-17 11:02:19.648735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5375b45c cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.211 [2024-11-17 11:02:19.648749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.211 [2024-11-17 11:02:19.648799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.211 [2024-11-17 11:02:19.648813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.648866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.648879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.648932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.648946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.212 #18 NEW cov: 12459 ft: 14512 corp: 10/306b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:55.212 [2024-11-17 11:02:19.708647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.708673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.708726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.708740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.708791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.708805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.708855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.708868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.212 #19 NEW cov: 12459 ft: 14562 corp: 11/346b lim: 45 exec/s: 0 rss: 72Mb L: 40/45 MS: 1 EraseBytes- 00:07:55.212 [2024-11-17 11:02:19.748312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.748338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.212 #21 NEW cov: 12459 ft: 14681 corp: 12/356b lim: 45 exec/s: 0 rss: 72Mb L: 10/45 MS: 2 InsertByte-PersAutoDict- DE: "\012\301rL\243\254\212\000"- 00:07:55.212 [2024-11-17 11:02:19.788822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.788851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.788922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.788937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.788991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.789005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.789054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.789068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.212 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:55.212 #22 NEW cov: 12482 ft: 14707 corp: 13/397b lim: 45 exec/s: 0 rss: 73Mb L: 41/45 MS: 1 InsertRepeatedBytes- 00:07:55.212 [2024-11-17 11:02:19.849151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:55.212 [2024-11-17 11:02:19.849177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.849229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.849243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.849295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.849326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.849379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff0aff cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.849392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.212 [2024-11-17 11:02:19.849446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.212 [2024-11-17 11:02:19.849459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.473 #23 NEW cov: 12482 ft: 14722 corp: 14/442b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:55.473 [2024-11-17 11:02:19.909214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:19.909240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:19.909295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c172cb0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:19.909308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:19.909361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:19.909378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:19.909429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:19.909444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.473 #24 NEW cov: 12482 ft: 14739 corp: 15/483b lim: 45 exec/s: 24 rss: 73Mb L: 41/45 MS: 1 CrossOver- 00:07:55.473 [2024-11-17 11:02:19.968897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c1720a0f cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:55.473 [2024-11-17 11:02:19.968922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.473 #25 NEW cov: 12482 ft: 14805 corp: 16/493b lim: 45 exec/s: 25 rss: 73Mb L: 10/45 MS: 1 ChangeBinInt- 00:07:55.473 [2024-11-17 11:02:20.029722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.029750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.029807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.029822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.029874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.029888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.029942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.029956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.030011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:1eff003f cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.030026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.473 #26 NEW cov: 12482 ft: 14822 corp: 17/538b lim: 45 exec/s: 26 rss: 73Mb L: 45/45 MS: 1 CMP- DE: "\000\000?\036"- 00:07:55.473 [2024-11-17 11:02:20.089906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4cc10aac cdw11:0a720005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.089934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.089988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.090001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.090057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.090071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.090123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.090140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.473 [2024-11-17 11:02:20.090194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.473 [2024-11-17 11:02:20.090207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.473 #27 NEW cov: 12482 ft: 14844 corp: 18/583b lim: 45 exec/s: 27 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:55.733 [2024-11-17 11:02:20.130017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.733 [2024-11-17 11:02:20.130049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.733 [2024-11-17 11:02:20.130106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c172cb0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.733 [2024-11-17 11:02:20.130119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.733 [2024-11-17 11:02:20.130170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.733 [2024-11-17 11:02:20.130183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.733 [2024-11-17 11:02:20.130235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.733 [2024-11-17 11:02:20.130248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.733 [2024-11-17 11:02:20.130299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.130312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.734 #28 NEW cov: 12482 ft: 14855 corp: 19/628b lim: 45 exec/s: 28 rss: 73Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:55.734 [2024-11-17 11:02:20.189515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a004a cdw11:c1a30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.189541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 #30 NEW cov: 12482 ft: 14919 corp: 20/638b lim: 45 exec/s: 30 rss: 73Mb L: 10/45 MS: 2 EraseBytes-CopyPart- 00:07:55.734 [2024-11-17 11:02:20.230108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.230135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.230188] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c172cb0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.230201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.230254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:dfff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.230268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.230321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.230338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.734 #31 NEW cov: 12482 ft: 14991 corp: 21/679b lim: 45 exec/s: 31 rss: 73Mb L: 41/45 MS: 1 ChangeBit- 00:07:55.734 [2024-11-17 11:02:20.269753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00a30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.269779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 #32 NEW cov: 12482 ft: 15040 corp: 22/689b lim: 45 exec/s: 32 rss: 73Mb L: 10/45 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:55.734 [2024-11-17 11:02:20.330579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.330605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.330661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c172cb0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.330675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.330726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:dfff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.330740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.330794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.330807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.734 [2024-11-17 11:02:20.330861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:3f1e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-11-17 11:02:20.330874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.734 #33 NEW cov: 
12482 ft: 15067 corp: 23/734b lim: 45 exec/s: 33 rss: 73Mb L: 45/45 MS: 1 PersAutoDict- DE: "\000\000?\036"- 00:07:55.994 [2024-11-17 11:02:20.390086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:003f0a00 cdw11:0ac10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.994 [2024-11-17 11:02:20.390112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.994 #36 NEW cov: 12482 ft: 15121 corp: 24/747b lim: 45 exec/s: 36 rss: 73Mb L: 13/45 MS: 3 ShuffleBytes-PersAutoDict-PersAutoDict- DE: "\000\000?\036"-"\012\301rL\243\254\212\000"- 00:07:55.994 [2024-11-17 11:02:20.430849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.430876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.430931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c172cb0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.430945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.431000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:dfff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.431014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.431070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.431083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.431135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ff00ffff cdw11:3f1e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.431148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.995 #37 NEW cov: 12482 ft: 15138 corp: 25/792b lim: 45 exec/s: 37 rss: 74Mb L: 45/45 MS: 1 CopyPart- 00:07:55.995 [2024-11-17 11:02:20.491049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.491074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.491126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac024ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.491140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.491191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.491204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.491256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.491269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.995 [2024-11-17 11:02:20.491322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:1eff003f cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.491335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.995 #38 NEW cov: 12482 ft: 15149 corp: 26/837b lim: 45 exec/s: 38 rss: 74Mb L: 45/45 MS: 1 ChangeByte- 00:07:55.995 [2024-11-17 11:02:20.550506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00a30a01 cdw11:ac8a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.550531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.995 #39 NEW cov: 12482 ft: 15169 corp: 27/852b lim: 45 exec/s: 39 rss: 74Mb L: 15/45 MS: 1 CopyPart- 00:07:55.995 [2024-11-17 11:02:20.610696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c1320a0f cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.995 [2024-11-17 11:02:20.610722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.995 #40 NEW cov: 12482 ft: 15179 corp: 28/862b lim: 45 exec/s: 40 rss: 74Mb L: 10/45 MS: 1 ChangeBit- 00:07:56.255 [2024-11-17 11:02:20.651493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.651518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.651572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:a3004cac cdw11:ff8a0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.651590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.651643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.651657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.651708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.651722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.651774] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:1eff003f cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.651787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.255 #46 NEW cov: 12482 ft: 15187 corp: 29/907b lim: 45 exec/s: 46 rss: 74Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:56.255 [2024-11-17 11:02:20.691438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0ac10acb cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.691463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.691517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c172cb0a cdw11:4ca30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.691532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.691583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.691597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.255 [2024-11-17 11:02:20.691649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.255 [2024-11-17 11:02:20.691662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.256 #47 NEW cov: 12482 ft: 15191 corp: 30/948b lim: 45 exec/s: 47 rss: 74Mb L: 41/45 MS: 1 ChangeBit- 00:07:56.256 [2024-11-17 11:02:20.731260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ac8a0a0a cdw11:004c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.731287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.731343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0a0a0a0a cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.731357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.256 #50 NEW cov: 12482 ft: 15449 corp: 31/971b lim: 45 exec/s: 50 rss: 74Mb L: 23/45 MS: 3 EraseBytes-CrossOver-InsertRepeatedBytes- 00:07:56.256 [2024-11-17 11:02:20.771787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.771813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.771868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.771885] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.771937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.771950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.772003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0ac1ffff cdw11:724c0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.772017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.772073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.772087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.256 #51 NEW cov: 12482 ft: 15460 corp: 32/1016b lim: 45 exec/s: 51 rss: 74Mb L: 45/45 MS: 1 ChangeByte- 00:07:56.256 [2024-11-17 11:02:20.811770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4cc10aac cdw11:0a720005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.811796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.811852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ac8a4ca3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.811865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.811917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.811931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.811984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.811997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.256 #52 NEW cov: 12482 ft: 15497 corp: 33/1058b lim: 45 exec/s: 52 rss: 74Mb L: 42/45 MS: 1 EraseBytes- 00:07:56.256 [2024-11-17 11:02:20.871586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.871611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.256 [2024-11-17 11:02:20.871668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.256 [2024-11-17 11:02:20.871682] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.256 #55 NEW cov: 12482 ft: 15506 corp: 34/1082b lim: 45 exec/s: 55 rss: 74Mb L: 24/45 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:56.517 [2024-11-17 11:02:20.912237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:724c0ac1 cdw11:a3ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.517 [2024-11-17 11:02:20.912264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.517 [2024-11-17 11:02:20.912320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5375b45c cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.517 [2024-11-17 11:02:20.912337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.517 [2024-11-17 11:02:20.912389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.517 [2024-11-17 11:02:20.912403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.517 [2024-11-17 11:02:20.912456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3dff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.517 [2024-11-17 11:02:20.912470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.517 [2024-11-17 11:02:20.912521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.517 [2024-11-17 11:02:20.912535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.517 #56 NEW cov: 12482 ft: 15508 corp: 35/1127b lim: 45 exec/s: 28 rss: 74Mb L: 45/45 MS: 1 ChangeByte- 00:07:56.517 #56 DONE cov: 12482 ft: 15508 corp: 35/1127b lim: 45 exec/s: 28 rss: 74Mb 00:07:56.517 ###### Recommended dictionary. ###### 00:07:56.517 "\012\301rL\243\254\212\000" # Uses: 3 00:07:56.517 "\000\000?\036" # Uses: 2 00:07:56.517 "\001\000\000\000" # Uses: 0 00:07:56.517 ###### End of recommended dictionary. 
###### 00:07:56.517 Done 56 runs in 2 second(s) 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:56.517 11:02:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:56.517 [2024-11-17 11:02:21.099565] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:56.517 [2024-11-17 11:02:21.099656] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid150579 ] 00:07:56.778 [2024-11-17 11:02:21.309549] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.778 [2024-11-17 11:02:21.322479] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.778 [2024-11-17 11:02:21.374907] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.778 [2024-11-17 11:02:21.391253] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:56.778 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.778 INFO: Seed: 153427254 00:07:56.778 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:56.778 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:56.778 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:56.778 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.778 #2 INITED exec/s: 0 rss: 65Mb 00:07:56.778 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.778 This may also happen if the target rejected all inputs we tried so far 00:07:57.038 [2024-11-17 11:02:21.458184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.038 [2024-11-17 11:02:21.458223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.038 [2024-11-17 11:02:21.458342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.038 [2024-11-17 11:02:21.458361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.038 [2024-11-17 11:02:21.458478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.038 [2024-11-17 11:02:21.458498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.038 [2024-11-17 11:02:21.458619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.038 [2024-11-17 11:02:21.458637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.299 NEW_FUNC[1/714]: 0x45d2c8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:57.299 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.299 #3 NEW cov: 12173 ft: 12173 corp: 2/10b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\016\000\000\000\000\000\000\000"- 00:07:57.299 [2024-11-17 11:02:21.788767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.788811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.788934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.788952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.789065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f800 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.789085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.789195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.789214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.299 #4 NEW cov: 12286 ft: 12909 corp: 3/19b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:57.299 [2024-11-17 11:02:21.858873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.858901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.859014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.859032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.859145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.859161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.859273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.859290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.299 #5 NEW cov: 12292 ft: 13170 corp: 4/28b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBit- 00:07:57.299 [2024-11-17 11:02:21.899003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.899032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.899149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.899164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.899266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.899282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.899389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.899404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.299 #6 NEW cov: 12377 ft: 13483 corp: 5/37b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:07:57.299 [2024-11-17 11:02:21.949369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.949397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:57.299 [2024-11-17 11:02:21.949502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.949519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.949630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f800 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.949646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.949752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.949768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.299 [2024-11-17 11:02:21.949882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.299 [2024-11-17 11:02:21.949900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.560 #7 NEW cov: 12377 ft: 13671 corp: 6/47b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 CrossOver- 00:07:57.560 [2024-11-17 11:02:22.019521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.019548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.560 [2024-11-17 11:02:22.019654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.019671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.560 [2024-11-17 11:02:22.019779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f800 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.019796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.560 [2024-11-17 11:02:22.019905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.019921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.560 [2024-11-17 11:02:22.020030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00002d00 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.020049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.560 #8 NEW cov: 12377 ft: 13707 corp: 7/57b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeByte- 00:07:57.560 [2024-11-17 11:02:22.079593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.079622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:57.560 [2024-11-17 11:02:22.079726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.079742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.560 [2024-11-17 11:02:22.079856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.079874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.560 [2024-11-17 11:02:22.079991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.560 [2024-11-17 11:02:22.080007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.560 #9 NEW cov: 12377 ft: 13751 corp: 8/66b lim: 10 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:57.560 [2024-11-17 11:02:22.129945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.129974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.561 [2024-11-17 11:02:22.130105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.130124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.561 [2024-11-17 11:02:22.130227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006900 cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.130247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.561 [2024-11-17 11:02:22.130360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.130377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.561 [2024-11-17 11:02:22.130485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.130501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.561 #10 NEW cov: 12377 ft: 13786 corp: 9/76b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 InsertByte- 00:07:57.561 [2024-11-17 11:02:22.200033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.200064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.561 [2024-11-17 11:02:22.200175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.200192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:57.561 [2024-11-17 11:02:22.200300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.200316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.561 [2024-11-17 11:02:22.200424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.561 [2024-11-17 11:02:22.200438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.821 #11 NEW cov: 12377 ft: 13799 corp: 10/85b lim: 10 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 ChangeByte- 00:07:57.821 [2024-11-17 11:02:22.250353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.250381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.821 [2024-11-17 11:02:22.250495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.250512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.821 [2024-11-17 11:02:22.250619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000097ff cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.250635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.821 [2024-11-17 11:02:22.250748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.250763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.821 [2024-11-17 11:02:22.250874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.250890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.821 #12 NEW cov: 12377 ft: 13911 corp: 11/95b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:57.821 [2024-11-17 11:02:22.320377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.320404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.821 [2024-11-17 11:02:22.320529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000090e cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.320545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.821 [2024-11-17 11:02:22.320650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.821 [2024-11-17 11:02:22.320665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:57.821 [2024-11-17 11:02:22.320769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.320786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.822 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:57.822 #13 NEW cov: 12400 ft: 13969 corp: 12/104b lim: 10 exec/s: 0 rss: 73Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:57.822 [2024-11-17 11:02:22.390871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.390899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.391016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.391034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.391156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006900 cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.391173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.391292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000fa00 cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.391308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.391423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.391439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.822 #14 NEW cov: 12400 ft: 13984 corp: 13/114b lim: 10 exec/s: 0 rss: 73Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:57.822 [2024-11-17 11:02:22.440993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2e cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.441020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.441137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.441155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.441268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006900 cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.441285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.441404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.822 [2024-11-17 
11:02:22.441422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.822 [2024-11-17 11:02:22.441535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:57.822 [2024-11-17 11:02:22.441550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.822 #15 NEW cov: 12400 ft: 13997 corp: 14/124b lim: 10 exec/s: 15 rss: 73Mb L: 10/10 MS: 1 ChangeByte- 00:07:58.082 [2024-11-17 11:02:22.491038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.082 [2024-11-17 11:02:22.491070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.082 [2024-11-17 11:02:22.491188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.082 [2024-11-17 11:02:22.491208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.082 [2024-11-17 11:02:22.491319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.082 [2024-11-17 11:02:22.491337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.082 [2024-11-17 11:02:22.491440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.082 [2024-11-17 11:02:22.491456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.082 [2024-11-17 11:02:22.491565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:58.082 [2024-11-17 11:02:22.491582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.082 #16 NEW cov: 12400 ft: 14012 corp: 15/134b lim: 10 exec/s: 16 rss: 73Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:58.082 [2024-11-17 11:02:22.561141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:58.082 [2024-11-17 11:02:22.561168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.561271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000090e cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.561289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.561399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.561414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.561523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000003d cdw11:00000000 00:07:58.083 
[2024-11-17 11:02:22.561540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.083 #17 NEW cov: 12400 ft: 14025 corp: 16/143b lim: 10 exec/s: 17 rss: 73Mb L: 9/10 MS: 1 ChangeByte- 00:07:58.083 [2024-11-17 11:02:22.631363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.631391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.631497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.631514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.631626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.631643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.631755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.631772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.083 #18 NEW cov: 12400 ft: 14046 corp: 17/152b lim: 10 exec/s: 18 rss: 73Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:58.083 [2024-11-17 11:02:22.681861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.681888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.681998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.682014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.682130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.682146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.682270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.682287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.083 [2024-11-17 11:02:22.682403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:58.083 [2024-11-17 11:02:22.682420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.083 #19 NEW cov: 12400 ft: 14068 corp: 18/162b lim: 10 exec/s: 19 rss: 73Mb L: 10/10 MS: 1 ChangeBit- 00:07:58.343 [2024-11-17 11:02:22.751927] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.343 [2024-11-17 11:02:22.751957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.343 [2024-11-17 11:02:22.752073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.343 [2024-11-17 11:02:22.752091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.752207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.752223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.752344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.752362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.752477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.752495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.344 #20 NEW cov: 12400 ft: 14094 corp: 19/172b lim: 10 exec/s: 20 rss: 73Mb L: 10/10 MS: 1 CopyPart- 00:07:58.344 [2024-11-17 11:02:22.801834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.801865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.801983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.802001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.802121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.802139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.802252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.802270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.344 #21 NEW cov: 12400 ft: 14128 corp: 20/181b lim: 10 exec/s: 21 rss: 73Mb L: 9/10 MS: 1 PersAutoDict- DE: "\016\000\000\000\000\000\000\000"- 00:07:58.344 [2024-11-17 11:02:22.871903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.871931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.344 
[2024-11-17 11:02:22.872047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.872066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.872179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000003f cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.872195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.344 #22 NEW cov: 12400 ft: 14353 corp: 21/187b lim: 10 exec/s: 22 rss: 73Mb L: 6/10 MS: 1 EraseBytes- 00:07:58.344 [2024-11-17 11:02:22.942580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.942609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.942727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.942743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.942853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.942870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.942988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.943006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.344 [2024-11-17 11:02:22.943102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.344 [2024-11-17 11:02:22.943120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.344 #23 NEW cov: 12400 ft: 14400 corp: 22/197b lim: 10 exec/s: 23 rss: 73Mb L: 10/10 MS: 1 PersAutoDict- DE: "\016\000\000\000\000\000\000\000"- 00:07:58.604 [2024-11-17 11:02:23.012750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.604 [2024-11-17 11:02:23.012781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.604 [2024-11-17 11:02:23.012890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:58.604 [2024-11-17 11:02:23.012908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.604 [2024-11-17 11:02:23.013016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:58.604 [2024-11-17 11:02:23.013032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.604 [2024-11-17 11:02:23.013148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000e0e cdw11:00000000 00:07:58.604 [2024-11-17 11:02:23.013163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.604 [2024-11-17 11:02:23.013279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.604 [2024-11-17 11:02:23.013296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.604 #24 NEW cov: 12400 ft: 14423 corp: 23/207b lim: 10 exec/s: 24 rss: 73Mb L: 10/10 MS: 1 CopyPart- 00:07:58.604 [2024-11-17 11:02:23.083014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:58.604 [2024-11-17 11:02:23.083046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.083161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.083178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.083283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.083299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.083410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.083426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.083538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003d00 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.083553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.605 #25 NEW cov: 12400 ft: 14442 corp: 24/217b lim: 10 exec/s: 25 rss: 74Mb L: 10/10 MS: 1 CopyPart- 00:07:58.605 [2024-11-17 11:02:23.133092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.133121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.133240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.133256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.133369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006900 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.133383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.133486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00001000 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.133504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.133606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.133620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.605 #26 NEW cov: 12400 ft: 14460 corp: 25/227b lim: 10 exec/s: 26 rss: 74Mb L: 10/10 MS: 1 ChangeBit- 00:07:58.605 [2024-11-17 11:02:23.183194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.183228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.183331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.183348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.183458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f800 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.183476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.183588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.183605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.183706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.183723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.605 #27 NEW cov: 12400 ft: 14480 corp: 26/237b lim: 10 exec/s: 27 rss: 74Mb L: 10/10 MS: 1 ChangeBit- 00:07:58.605 [2024-11-17 11:02:23.223111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.223138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.223242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.223257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.223384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.223400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.605 [2024-11-17 11:02:23.223507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:58.605 [2024-11-17 11:02:23.223522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.605 #28 NEW cov: 12400 ft: 14560 corp: 27/246b lim: 10 exec/s: 28 rss: 74Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:58.866 [2024-11-17 11:02:23.273290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.273317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.273429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.273447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.273564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f800 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.273581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.273688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.273705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.866 #29 NEW cov: 12400 ft: 14638 corp: 28/255b lim: 10 exec/s: 29 rss: 74Mb L: 9/10 MS: 1 CopyPart- 00:07:58.866 [2024-11-17 11:02:23.313600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.313626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.313730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.313745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.313861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f800 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.313877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.313984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.313999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.314117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.314135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.866 #30 NEW cov: 12400 ft: 14672 corp: 29/265b lim: 10 exec/s: 30 rss: 74Mb L: 10/10 MS: 1 CopyPart- 00:07:58.866 [2024-11-17 11:02:23.363578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.363606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.363730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.363746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.866 [2024-11-17 11:02:23.363860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.866 [2024-11-17 11:02:23.363877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.867 [2024-11-17 11:02:23.363989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.867 [2024-11-17 11:02:23.364006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.867 #31 NEW cov: 12400 ft: 14676 corp: 30/274b lim: 10 exec/s: 31 rss: 74Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:58.867 [2024-11-17 11:02:23.413713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:58.867 [2024-11-17 11:02:23.413742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.867 [2024-11-17 11:02:23.413857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:58.867 [2024-11-17 11:02:23.413873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.867 [2024-11-17 11:02:23.413991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.867 [2024-11-17 11:02:23.414005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.867 [2024-11-17 11:02:23.414127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000062 cdw11:00000000 00:07:58.867 [2024-11-17 11:02:23.414144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.867 #32 pulse cov: 12400 ft: 14739 corp: 30/274b lim: 10 exec/s: 16 rss: 74Mb 00:07:58.867 #32 NEW cov: 12400 ft: 14739 corp: 31/283b lim: 10 exec/s: 16 rss: 74Mb L: 9/10 MS: 1 ChangeByte- 00:07:58.867 #32 DONE cov: 12400 ft: 14739 corp: 31/283b lim: 10 exec/s: 16 rss: 74Mb 00:07:58.867 ###### Recommended dictionary. ###### 00:07:58.867 "\016\000\000\000\000\000\000\000" # Uses: 2 00:07:58.867 ###### End of recommended dictionary. 
###### 00:07:58.867 Done 32 runs in 2 second(s) 00:07:59.127 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:59.128 11:02:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:59.128 [2024-11-17 11:02:23.598846] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:59.128 [2024-11-17 11:02:23.598936] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151107 ] 00:07:59.389 [2024-11-17 11:02:23.798459] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.389 [2024-11-17 11:02:23.811238] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.389 [2024-11-17 11:02:23.863691] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.389 [2024-11-17 11:02:23.879981] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:59.389 INFO: Running with entropic power schedule (0xFF, 100). 
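The trace above shows nvmf/run.sh preparing fuzzer 7: it removes the previous run's config and suppression files, derives TCP port 4407 from the fuzzer index, creates the corpus directory, rewrites the trsvcid in the shared JSON config, registers two known LSAN leak suppressions, and launches llvm_nvme_fuzz for 1 second on core 0x1. A minimal bash sketch of that flow, reconstructed from the traced commands (the function shape and the redirection targets are assumptions; this is not the verbatim nvmf/run.sh):

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # from the traced paths

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_${fuzzer_type}
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    local suppress_file=/var/tmp/suppress_nvmf_fuzz
    local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0

    # Fuzzer N gets its own NVMe/TCP listener on port 44NN (printf %02d pads N).
    local port=44$(printf %02d "$fuzzer_type")
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"

    # Point this instance's config at its private port (output redirection assumed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Known in-process allocations that outlive the run; suppress them so
    # LSAN does not fail the job (destination file assumed).
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"

    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 -P "$rootdir/../output/llvm/" -F "$trid" \
        -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
}

start_llvm_fuzz 7 1 0x1   # matches the ../common.sh@73 invocation traced above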
00:07:59.389 INFO: Seed: 2642455530 00:07:59.389 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:07:59.389 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:07:59.389 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:59.389 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.389 #2 INITED exec/s: 0 rss: 65Mb 00:07:59.389 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:59.389 This may also happen if the target rejected all inputs we tried so far 00:07:59.389 [2024-11-17 11:02:23.935332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002bb9 cdw11:00000000 00:07:59.389 [2024-11-17 11:02:23.935360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.649 NEW_FUNC[1/714]: 0x45dcc8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:59.649 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.649 #4 NEW cov: 12173 ft: 12172 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 ChangeByte-InsertByte- 00:07:59.649 [2024-11-17 11:02:24.266412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009d0a cdw11:00000000 00:07:59.649 [2024-11-17 11:02:24.266460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.650 #6 NEW cov: 12286 ft: 12779 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 CopyPart-InsertByte- 00:07:59.910 [2024-11-17 11:02:24.316258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009d0a cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.316284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.910 #7 NEW cov: 12292 ft: 12940 corp: 4/7b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CopyPart- 00:07:59.910 [2024-11-17 11:02:24.376476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.376501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.910 #11 NEW cov: 12377 ft: 13333 corp: 5/9b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 4 ShuffleBytes-CopyPart-ShuffleBytes-InsertByte- 00:07:59.910 [2024-11-17 11:02:24.416571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007acd cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.416597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.910 #13 NEW cov: 12377 ft: 13414 corp: 6/11b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 EraseBytes-InsertByte- 00:07:59.910 [2024-11-17 11:02:24.476677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.476702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:59.910 #14 NEW cov: 12377 ft: 13565 corp: 7/13b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ChangeByte- 00:07:59.910 [2024-11-17 11:02:24.517035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.517064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.910 [2024-11-17 11:02:24.517119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.517132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.910 [2024-11-17 11:02:24.517180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b60a cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.517210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.910 #15 NEW cov: 12377 ft: 13868 corp: 8/19b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:59.910 [2024-11-17 11:02:24.557262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.557288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.910 [2024-11-17 11:02:24.557340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008aac cdw11:00000000 00:07:59.910 [2024-11-17 11:02:24.557354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.911 [2024-11-17 11:02:24.557406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a68d cdw11:00000000 00:07:59.911 [2024-11-17 11:02:24.557420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.911 [2024-11-17 11:02:24.557472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002f90 cdw11:00000000 00:07:59.911 [2024-11-17 11:02:24.557485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #16 NEW cov: 12377 ft: 14097 corp: 9/28b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\001\212\254\246\215/\220\362"- 00:08:00.172 [2024-11-17 11:02:24.597046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000808 cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.597087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 #18 NEW cov: 12377 ft: 14150 corp: 10/30b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 2 ChangeBit-CopyPart- 00:08:00.172 [2024-11-17 11:02:24.637174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.637198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 #19 NEW cov: 12377 ft: 14189 corp: 11/32b lim: 10 
exec/s: 0 rss: 72Mb L: 2/9 MS: 1 CrossOver- 00:08:00.172 [2024-11-17 11:02:24.697666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000018a cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.697691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-11-17 11:02:24.697742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000aca6 cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.697756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-11-17 11:02:24.697809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008d2f cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.697822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.172 [2024-11-17 11:02:24.697874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000090f2 cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.697887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #24 NEW cov: 12377 ft: 14204 corp: 12/41b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 5 ChangeBit-CopyPart-CrossOver-ChangeBinInt-PersAutoDict- DE: "\001\212\254\246\215/\220\362"- 00:08:00.172 [2024-11-17 11:02:24.737806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.737830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-11-17 11:02:24.737898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008aac cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.737912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-11-17 11:02:24.737960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002f90 cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.737973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.172 [2024-11-17 11:02:24.738024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008da6 cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.738037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #25 NEW cov: 12377 ft: 14217 corp: 13/50b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:00.172 [2024-11-17 11:02:24.797615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b2b cdw11:00000000 00:08:00.172 [2024-11-17 11:02:24.797641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:00.433 #26 NEW cov: 12400 ft: 14299 corp: 14/52b lim: 10 exec/s: 0 rss: 73Mb L: 
2/9 MS: 1 CopyPart- 00:08:00.433 [2024-11-17 11:02:24.858019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002bff cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.858048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:24.858117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.858131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:24.858182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.858196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 #28 NEW cov: 12400 ft: 14329 corp: 15/59b lim: 10 exec/s: 0 rss: 73Mb L: 7/9 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:00.433 [2024-11-17 11:02:24.918234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007aff cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.918259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:24.918310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.918324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:24.918373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000bacd cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.918387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 #29 NEW cov: 12400 ft: 14430 corp: 16/65b lim: 10 exec/s: 29 rss: 73Mb L: 6/9 MS: 1 CMP- DE: "\377\377\377\272"- 00:08:00.433 [2024-11-17 11:02:24.978143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000808 cdw11:00000000 00:08:00.433 [2024-11-17 11:02:24.978168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 #30 NEW cov: 12400 ft: 14454 corp: 17/67b lim: 10 exec/s: 30 rss: 73Mb L: 2/9 MS: 1 ShuffleBytes- 00:08:00.433 [2024-11-17 11:02:25.038555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001616 cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.038580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:25.038631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001616 cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.038644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:25.038694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000160a cdw11:00000000 
00:08:00.433 [2024-11-17 11:02:25.038708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 #31 NEW cov: 12400 ft: 14516 corp: 18/73b lim: 10 exec/s: 31 rss: 73Mb L: 6/9 MS: 1 InsertRepeatedBytes- 00:08:00.433 [2024-11-17 11:02:25.078818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.078843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:25.078911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008aac cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.078925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:25.078976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a67a cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.078990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:25.079038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008d2f cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.079056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.433 [2024-11-17 11:02:25.079105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000090f2 cdw11:00000000 00:08:00.433 [2024-11-17 11:02:25.079118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.694 #32 NEW cov: 12400 ft: 14574 corp: 19/83b lim: 10 exec/s: 32 rss: 73Mb L: 10/10 MS: 1 InsertByte- 00:08:00.694 [2024-11-17 11:02:25.118983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001616 cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.119008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.119080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001616 cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.119095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.119154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001619 cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.119167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.119219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001919 cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.119232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.119283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000190a 
cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.119296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.694 #33 NEW cov: 12400 ft: 14581 corp: 20/93b lim: 10 exec/s: 33 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:00.694 [2024-11-17 11:02:25.178684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.178709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.694 #34 NEW cov: 12400 ft: 14602 corp: 21/96b lim: 10 exec/s: 34 rss: 73Mb L: 3/10 MS: 1 CrossOver- 00:08:00.694 [2024-11-17 11:02:25.238827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008108 cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.238853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.694 #35 NEW cov: 12400 ft: 14629 corp: 22/98b lim: 10 exec/s: 35 rss: 73Mb L: 2/10 MS: 1 ChangeByte- 00:08:00.694 [2024-11-17 11:02:25.279450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.279475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.279529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.279542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.279593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.279606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.279657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.279670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.694 [2024-11-17 11:02:25.279721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:00.694 [2024-11-17 11:02:25.279734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.694 #36 NEW cov: 12400 ft: 14651 corp: 23/108b lim: 10 exec/s: 36 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:00.695 [2024-11-17 11:02:25.319286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007aff cdw11:00000000 00:08:00.695 [2024-11-17 11:02:25.319311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.695 [2024-11-17 11:02:25.319360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffba cdw11:00000000 00:08:00.695 [2024-11-17 11:02:25.319373] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.695 [2024-11-17 11:02:25.319421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffcd cdw11:00000000 00:08:00.695 [2024-11-17 11:02:25.319435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.955 #37 NEW cov: 12400 ft: 14656 corp: 24/114b lim: 10 exec/s: 37 rss: 73Mb L: 6/10 MS: 1 ShuffleBytes- 00:08:00.955 [2024-11-17 11:02:25.379616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000018a cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.379641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.955 [2024-11-17 11:02:25.379694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000aca6 cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.379708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.955 [2024-11-17 11:02:25.379762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008d2f cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.379775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.955 [2024-11-17 11:02:25.379825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000090f0 cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.379839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.955 #38 NEW cov: 12400 ft: 14677 corp: 25/123b lim: 10 exec/s: 38 rss: 73Mb L: 9/10 MS: 1 ChangeBit- 00:08:00.955 [2024-11-17 11:02:25.439377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009d04 cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.439402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.955 #39 NEW cov: 12400 ft: 14694 corp: 26/125b lim: 10 exec/s: 39 rss: 73Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:00.955 [2024-11-17 11:02:25.479733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002ba2 cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.479759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.955 [2024-11-17 11:02:25.479827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.479841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.955 [2024-11-17 11:02:25.479892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:08:00.955 [2024-11-17 11:02:25.479906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.955 #40 NEW cov: 12400 ft: 14700 corp: 27/132b lim: 10 exec/s: 40 rss: 73Mb L: 7/10 MS: 1 
InsertRepeatedBytes- 00:08:00.956 [2024-11-17 11:02:25.519941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007a8a cdw11:00000000 00:08:00.956 [2024-11-17 11:02:25.519965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.956 [2024-11-17 11:02:25.520019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000aca6 cdw11:00000000 00:08:00.956 [2024-11-17 11:02:25.520033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.956 [2024-11-17 11:02:25.520089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008d2f cdw11:00000000 00:08:00.956 [2024-11-17 11:02:25.520103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.956 [2024-11-17 11:02:25.520154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000090f2 cdw11:00000000 00:08:00.956 [2024-11-17 11:02:25.520167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.956 #41 NEW cov: 12400 ft: 14707 corp: 28/141b lim: 10 exec/s: 41 rss: 73Mb L: 9/10 MS: 1 CrossOver- 00:08:00.956 [2024-11-17 11:02:25.559828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000808 cdw11:00000000 00:08:00.956 [2024-11-17 11:02:25.559853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.956 [2024-11-17 11:02:25.559922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a7a cdw11:00000000 00:08:00.956 [2024-11-17 11:02:25.559937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.956 #42 NEW cov: 12400 ft: 14851 corp: 29/145b lim: 10 exec/s: 42 rss: 73Mb L: 4/10 MS: 1 CrossOver- 00:08:01.217 [2024-11-17 11:02:25.620201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002ba2 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.620227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.620296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.620310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.620360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffba cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.620373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.217 #43 NEW cov: 12400 ft: 14858 corp: 30/152b lim: 10 exec/s: 43 rss: 73Mb L: 7/10 MS: 1 PersAutoDict- DE: "\377\377\377\272"- 00:08:01.217 [2024-11-17 11:02:25.680495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000409 cdw11:00000000 
00:08:01.217 [2024-11-17 11:02:25.680520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.680568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000909 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.680581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.680633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000909 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.680662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.680712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000909 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.680726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.740587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fcf6 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.740613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.740666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f6f6 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.740679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.740731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f6f6 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.740745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.740795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f6fe cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.740811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.217 #46 NEW cov: 12400 ft: 14864 corp: 31/160b lim: 10 exec/s: 46 rss: 73Mb L: 8/10 MS: 3 EraseBytes-InsertRepeatedBytes-ChangeBinInt- 00:08:01.217 [2024-11-17 11:02:25.780341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000029b9 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.780366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.217 #47 NEW cov: 12400 ft: 14874 corp: 32/162b lim: 10 exec/s: 47 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:08:01.217 [2024-11-17 11:02:25.820515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000029b9 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.820539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.217 [2024-11-17 11:02:25.820591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:000029b9 cdw11:00000000 00:08:01.217 [2024-11-17 11:02:25.820605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.218 #48 NEW cov: 12400 ft: 14912 corp: 33/166b lim: 10 exec/s: 48 rss: 73Mb L: 4/10 MS: 1 CopyPart- 00:08:01.479 [2024-11-17 11:02:25.880868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:08:01.479 [2024-11-17 11:02:25.880893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.479 [2024-11-17 11:02:25.880944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008aac cdw11:00000000 00:08:01.479 [2024-11-17 11:02:25.880958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.479 [2024-11-17 11:02:25.881008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000090f2 cdw11:00000000 00:08:01.479 [2024-11-17 11:02:25.881021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.479 #49 NEW cov: 12400 ft: 14920 corp: 34/172b lim: 10 exec/s: 49 rss: 74Mb L: 6/10 MS: 1 EraseBytes- 00:08:01.479 [2024-11-17 11:02:25.920921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:01.479 [2024-11-17 11:02:25.920946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.479 [2024-11-17 11:02:25.921013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffba cdw11:00000000 00:08:01.479 [2024-11-17 11:02:25.921026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.479 [2024-11-17 11:02:25.921078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008108 cdw11:00000000 00:08:01.479 [2024-11-17 11:02:25.921092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.479 #50 NEW cov: 12400 ft: 14940 corp: 35/178b lim: 10 exec/s: 25 rss: 74Mb L: 6/10 MS: 1 PersAutoDict- DE: "\377\377\377\272"- 00:08:01.479 #50 DONE cov: 12400 ft: 14940 corp: 35/178b lim: 10 exec/s: 25 rss: 74Mb 00:08:01.479 ###### Recommended dictionary. ###### 00:08:01.479 "\001\212\254\246\215/\220\362" # Uses: 1 00:08:01.479 "\377\377\377\272" # Uses: 2 00:08:01.479 ###### End of recommended dictionary. 
###### 00:08:01.479 Done 50 runs in 2 second(s) 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:01.479 11:02:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:01.479 [2024-11-17 11:02:26.106853] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:01.479 [2024-11-17 11:02:26.106922] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151562 ] 00:08:01.740 [2024-11-17 11:02:26.304293] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.740 [2024-11-17 11:02:26.316901] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.740 [2024-11-17 11:02:26.369171] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.740 [2024-11-17 11:02:26.385508] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:02.001 INFO: Running with entropic power schedule (0xFF, 100). 
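As with run 6, run 7 above closes with a "Recommended dictionary" block: input fragments libFuzzer found productive, printed with octal escapes (here the PersAutoDict pattern "\001\212\254\246\215/\220\362" and the CMP pattern "\377\377\377\272"). Such entries can be persisted in standard libFuzzer/AFL dictionary syntax for a later run; a hedged sketch follows (the file name is made up, and while -dict= is a real libFuzzer flag, whether the llvm_nvme_fuzz wrapper forwards it through to libFuzzer is an assumption):

# Octal escapes from the log rewritten as the equivalent hex escapes;
# the dictionary format is one name="escaped bytes" entry per line.
cat > /tmp/llvm_nvmf_7.dict <<'EOF'
persautodict="\x01\x8a\xac\xa6\x8d\x2f\x90\xf2"
cmp_ffffffba="\xff\xff\xff\xba"
EOF
# If -dict= cannot be passed through the wrapper, the same bytes can
# simply seed the corpus directory that -D points at:
printf '\x01\x8a\xac\xa6\x8d\x2f\x90\xf2' \
    > /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7/seed-persautodict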
00:08:02.001 INFO: Seed: 852456765 00:08:02.001 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:02.001 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:02.001 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:02.001 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.001 [2024-11-17 11:02:26.440977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.441006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 #2 INITED cov: 12201 ft: 12198 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:02.001 [2024-11-17 11:02:26.481017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.481048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 #3 NEW cov: 12314 ft: 12863 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeByte- 00:08:02.001 [2024-11-17 11:02:26.541346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.541372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 [2024-11-17 11:02:26.541430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.541444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.001 #4 NEW cov: 12320 ft: 13720 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:08:02.001 [2024-11-17 11:02:26.581402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.581427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 [2024-11-17 11:02:26.581504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.581518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.001 #5 NEW cov: 12405 ft: 13930 corp: 4/6b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeBit- 00:08:02.001 [2024-11-17 11:02:26.641926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.641951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 [2024-11-17 11:02:26.642027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.642045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.001 [2024-11-17 11:02:26.642102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.642115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.001 [2024-11-17 11:02:26.642172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-11-17 11:02:26.642186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.262 #6 NEW cov: 12405 ft: 14267 corp: 5/10b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CopyPart- 00:08:02.262 [2024-11-17 11:02:26.681660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.681686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.681745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.681759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.262 #7 NEW cov: 12405 ft: 14364 corp: 6/12b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:08:02.262 [2024-11-17 11:02:26.721818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.721847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.721907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.721921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.262 #8 NEW cov: 12405 ft: 14471 corp: 7/14b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:08:02.262 [2024-11-17 11:02:26.782283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.782308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.782367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.782380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.782439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.782452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.782509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.782522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.262 #9 NEW cov: 12405 ft: 14501 corp: 8/18b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CopyPart- 00:08:02.262 [2024-11-17 11:02:26.822394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.822419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.822495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.822510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.822570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.822584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.262 [2024-11-17 11:02:26.822641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.822655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.262 #10 NEW cov: 12405 ft: 14534 corp: 9/22b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CrossOver- 00:08:02.262 [2024-11-17 11:02:26.861992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-11-17 11:02:26.862016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.262 #11 NEW cov: 12405 ft: 14627 corp: 10/23b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 CopyPart- 00:08:02.523 [2024-11-17 11:02:26.922330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:26.922356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.523 [2024-11-17 11:02:26.922432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:26.922458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.523 #12 NEW cov: 12405 ft: 14682 corp: 11/25b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:02.523 [2024-11-17 11:02:26.982481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:26.982506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.523 [2024-11-17 11:02:26.982563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:26.982577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.523 #13 NEW cov: 12405 ft: 14761 corp: 12/27b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:08:02.523 [2024-11-17 11:02:27.022960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:27.022984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.523 [2024-11-17 11:02:27.023060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:27.023074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.523 [2024-11-17 11:02:27.023144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:27.023157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.523 [2024-11-17 11:02:27.023214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:27.023228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.523 #14 NEW cov: 12405 ft: 14795 corp: 13/31b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CrossOver- 00:08:02.523 [2024-11-17 11:02:27.062738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:27.062763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.523 [2024-11-17 11:02:27.062823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-17 11:02:27.062837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.523 
#15 NEW cov: 12405 ft: 14917 corp: 14/33b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:02.523 [2024-11-17 11:02:27.122890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.524 [2024-11-17 11:02:27.122918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.524 [2024-11-17 11:02:27.122980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.524 [2024-11-17 11:02:27.122993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.524 #16 NEW cov: 12405 ft: 14990 corp: 15/35b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:02.784 [2024-11-17 11:02:27.183127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.784 [2024-11-17 11:02:27.183153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.784 [2024-11-17 11:02:27.183215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.784 [2024-11-17 11:02:27.183229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.784 #17 NEW cov: 12405 ft: 15021 corp: 16/37b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 ChangeBit- 00:08:02.784 [2024-11-17 11:02:27.243594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.784 [2024-11-17 11:02:27.243620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.784 [2024-11-17 11:02:27.243696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.784 [2024-11-17 11:02:27.243711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.784 [2024-11-17 11:02:27.243770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.784 [2024-11-17 11:02:27.243784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.785 [2024-11-17 11:02:27.243843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.785 [2024-11-17 11:02:27.243857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.785 #18 NEW cov: 12405 ft: 15066 corp: 17/41b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 ChangeBit- 00:08:02.785 [2024-11-17 11:02:27.303305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.785 [2024-11-17 11:02:27.303331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:03.046 #19 NEW cov: 12428 ft: 15098 corp: 18/42b lim: 5 exec/s: 19 rss: 73Mb L: 1/4 MS: 1 CrossOver- 00:08:03.046 [2024-11-17 11:02:27.614503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.614546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.614614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.614632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.614707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.614726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.614789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.614807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.046 #20 NEW cov: 12428 ft: 15219 corp: 19/46b lim: 5 exec/s: 20 rss: 73Mb L: 4/4 MS: 1 CrossOver- 00:08:03.046 [2024-11-17 11:02:27.654517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.654544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.654602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.654615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.654670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.654684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.654741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.654755] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.046 #21 NEW cov: 12428 ft: 15242 corp: 20/50b lim: 5 exec/s: 21 rss: 73Mb L: 4/4 MS: 1 CrossOver- 00:08:03.046 [2024-11-17 11:02:27.694298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.694325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 [2024-11-17 11:02:27.694382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-11-17 11:02:27.694396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.308 #22 NEW cov: 12428 ft: 15267 corp: 21/52b lim: 5 exec/s: 22 rss: 73Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:03.308 [2024-11-17 11:02:27.734436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.734461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.308 [2024-11-17 11:02:27.734520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.734534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.308 #23 NEW cov: 12428 ft: 15308 corp: 22/54b lim: 5 exec/s: 23 rss: 73Mb L: 2/4 MS: 1 ChangeByte- 00:08:03.308 [2024-11-17 11:02:27.774376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.774404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.308 #24 NEW cov: 12428 ft: 15337 corp: 23/55b lim: 5 exec/s: 24 rss: 73Mb L: 1/4 MS: 1 EraseBytes- 00:08:03.308 [2024-11-17 11:02:27.835026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.835054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.308 [2024-11-17 11:02:27.835126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.835140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.308 [2024-11-17 11:02:27.835192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.835206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:03.308 [2024-11-17 11:02:27.835258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.835271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.308 #25 NEW cov: 12428 ft: 15369 corp: 24/59b lim: 5 exec/s: 25 rss: 73Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:03.308 [2024-11-17 11:02:27.895055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.895081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.308 [2024-11-17 11:02:27.895135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.895149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.308 [2024-11-17 11:02:27.895204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.308 [2024-11-17 11:02:27.895218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.308 #26 NEW cov: 12428 ft: 15526 corp: 25/62b lim: 5 exec/s: 26 rss: 73Mb L: 3/4 MS: 1 CrossOver- 00:08:03.309 [2024-11-17 11:02:27.935274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.309 [2024-11-17 11:02:27.935299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.309 [2024-11-17 11:02:27.935353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.309 [2024-11-17 11:02:27.935366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.309 [2024-11-17 11:02:27.935420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.309 [2024-11-17 11:02:27.935433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.309 [2024-11-17 11:02:27.935488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.309 [2024-11-17 11:02:27.935501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.570 #27 NEW cov: 12428 ft: 15557 corp: 26/66b lim: 5 exec/s: 27 rss: 73Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:03.570 [2024-11-17 11:02:27.995180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:27.995205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:27.995262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:27.995275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.570 #28 NEW cov: 12428 ft: 15574 corp: 27/68b lim: 5 exec/s: 28 rss: 73Mb L: 2/4 MS: 1 InsertByte- 00:08:03.570 [2024-11-17 11:02:28.035271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.035296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:28.035367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.035381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.570 #29 NEW cov: 12428 ft: 15608 corp: 28/70b lim: 5 exec/s: 29 rss: 73Mb L: 2/4 MS: 1 ChangeBit- 00:08:03.570 [2024-11-17 11:02:28.095785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.095809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:28.095877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.095891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:28.095944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.095957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:28.096011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.096023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.570 #30 NEW cov: 12428 ft: 15663 corp: 29/74b lim: 5 exec/s: 30 rss: 73Mb L: 4/4 MS: 1 ChangeByte- 00:08:03.570 [2024-11-17 11:02:28.155922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.155946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.570 
[2024-11-17 11:02:28.155998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.156015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:28.156087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.156101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.570 [2024-11-17 11:02:28.156155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.156168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.570 #31 NEW cov: 12428 ft: 15675 corp: 30/78b lim: 5 exec/s: 31 rss: 73Mb L: 4/4 MS: 1 ChangeBit- 00:08:03.570 [2024-11-17 11:02:28.195539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.570 [2024-11-17 11:02:28.195563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.832 #32 NEW cov: 12428 ft: 15697 corp: 31/79b lim: 5 exec/s: 32 rss: 74Mb L: 1/4 MS: 1 EraseBytes- 00:08:03.832 [2024-11-17 11:02:28.255744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.255769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.832 #33 NEW cov: 12428 ft: 15718 corp: 32/80b lim: 5 exec/s: 33 rss: 74Mb L: 1/4 MS: 1 ChangeByte- 00:08:03.832 [2024-11-17 11:02:28.296129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.296154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.296222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.296237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.296286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.296300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.832 #34 NEW cov: 12428 ft: 15745 corp: 33/83b lim: 5 exec/s: 34 rss: 74Mb L: 3/4 MS: 1 InsertByte- 00:08:03.832 [2024-11-17 11:02:28.356486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.356511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.356580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.356594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.356647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.356661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.356714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.356731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.832 #35 NEW cov: 12428 ft: 15811 corp: 34/87b lim: 5 exec/s: 35 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:08:03.832 [2024-11-17 11:02:28.396571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.396596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.396664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.396679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.396729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.396743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.832 [2024-11-17 11:02:28.396794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.832 [2024-11-17 11:02:28.396808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.832 #36 NEW cov: 12428 ft: 15827 corp: 35/91b lim: 5 exec/s: 18 rss: 74Mb L: 4/4 MS: 1 ChangeByte- 00:08:03.832 #36 DONE cov: 12428 ft: 15827 corp: 35/91b lim: 5 exec/s: 18 rss: 74Mb 00:08:03.832 Done 36 runs in 2 second(s) 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 
00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:04.093 11:02:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:04.093 [2024-11-17 11:02:28.583220] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:04.093 [2024-11-17 11:02:28.583298] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151918 ] 00:08:04.353 [2024-11-17 11:02:28.781200] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.353 [2024-11-17 11:02:28.793592] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.353 [2024-11-17 11:02:28.846110] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.353 [2024-11-17 11:02:28.862480] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:04.353 INFO: Running with entropic power schedule (0xFF, 100). 
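The nvmf/run.sh trace above shows how run 9's listener and config are derived before llvm_nvme_fuzz launches: the fuzzer index is zero-padded and appended to 44 (printf %02d 9, giving trsvcid 4409), the stock fuzz_json.conf has its trsvcid rewritten for that port, and two LSAN leak suppressions are emitted. A hedged sketch of those steps — the sed expression, file names and echoed suppressions are taken from the trace, while the output redirections and the $rootdir variable are assumptions:

#!/usr/bin/env bash
# Sketch of the run 9 setup steps traced in nvmf/run.sh@34-@42.
fuzzer_type=9
suppress_file=/var/tmp/suppress_nvmf_fuzz
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk  # assumed

# "44" plus the zero-padded index -> trsvcid 4409, as in the
# "printf %02d 9" / "port=4409" trace lines.
port="44$(printf %02d "$fuzzer_type")"

# Rewrite the stock JSON config for this port; the trace shows only
# the sed command, so the redirection into $nvmf_cfg is an assumption.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

# LSAN suppressions echoed in the trace (destination file assumed,
# matching the suppress_file set at nvmf/run.sh@28).
echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
echo leak:nvmf_ctrlr_create          >> "$suppress_file"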
00:08:04.353 INFO: Seed: 3329452988 00:08:04.353 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:04.353 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:04.353 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:04.353 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.353 [2024-11-17 11:02:28.928659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.353 [2024-11-17 11:02:28.928695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.353 #2 INITED cov: 12201 ft: 12201 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:04.353 [2024-11-17 11:02:28.978991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.353 [2024-11-17 11:02:28.979023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.353 [2024-11-17 11:02:28.979155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.353 [2024-11-17 11:02:28.979172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.629 #3 NEW cov: 12314 ft: 13496 corp: 2/3b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:08:04.630 [2024-11-17 11:02:29.049796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.630 [2024-11-17 11:02:29.049831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.630 [2024-11-17 11:02:29.049960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.630 [2024-11-17 11:02:29.049976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.630 [2024-11-17 11:02:29.050099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.630 [2024-11-17 11:02:29.050116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.630 [2024-11-17 11:02:29.050235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.630 [2024-11-17 11:02:29.050252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.630 #4 NEW cov: 12320 ft: 14102 corp: 3/7b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:04.630 [2024-11-17 11:02:29.099933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:04.630 [2024-11-17 11:02:29.099962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.630 [2024-11-17 11:02:29.100108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.630 [2024-11-17 11:02:29.100126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.630 [2024-11-17 11:02:29.100251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.100268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.631 [2024-11-17 11:02:29.100385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.100403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.631 #5 NEW cov: 12405 ft: 14352 corp: 4/11b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ChangeBit- 00:08:04.631 [2024-11-17 11:02:29.170117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.170146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.631 [2024-11-17 11:02:29.170269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.170287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.631 [2024-11-17 11:02:29.170408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.170423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.631 [2024-11-17 11:02:29.170539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.170555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.631 #6 NEW cov: 12405 ft: 14445 corp: 5/15b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ChangeBit- 00:08:04.631 [2024-11-17 11:02:29.220253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.631 [2024-11-17 11:02:29.220282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.632 [2024-11-17 11:02:29.220409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.632 [2024-11-17 11:02:29.220426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.632 [2024-11-17 11:02:29.220545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.632 [2024-11-17 11:02:29.220563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.632 [2024-11-17 11:02:29.220684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.632 [2024-11-17 11:02:29.220700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.632 #7 NEW cov: 12405 ft: 14522 corp: 6/19b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ChangeByte- 00:08:04.894 [2024-11-17 11:02:29.290226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.894 [2024-11-17 11:02:29.290256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.894 [2024-11-17 11:02:29.290368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.894 [2024-11-17 11:02:29.290388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.894 [2024-11-17 11:02:29.290502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.894 [2024-11-17 11:02:29.290519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.894 #8 NEW cov: 12405 ft: 14730 corp: 7/22b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 EraseBytes- 00:08:04.894 [2024-11-17 11:02:29.360878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.894 [2024-11-17 11:02:29.360904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.894 [2024-11-17 11:02:29.361038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.894 [2024-11-17 11:02:29.361058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.894 [2024-11-17 11:02:29.361176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.894 [2024-11-17 11:02:29.361203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.361319] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.361334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.361459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.361476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.895 #9 NEW cov: 12405 ft: 14831 corp: 8/27b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:04.895 [2024-11-17 11:02:29.410566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.410592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.410713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.410729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.410853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.410870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.895 #10 NEW cov: 12405 ft: 14865 corp: 9/30b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 CrossOver- 00:08:04.895 [2024-11-17 11:02:29.460977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.461003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.461120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.461139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.461248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.461265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.461384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.461400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:04.895 #11 NEW cov: 12405 ft: 14933 corp: 10/34b lim: 5 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 ChangeBit- 00:08:04.895 [2024-11-17 11:02:29.510828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.510855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.510975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.510991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.895 [2024-11-17 11:02:29.511120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.895 [2024-11-17 11:02:29.511137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.155 #12 NEW cov: 12405 ft: 14961 corp: 11/37b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:05.155 [2024-11-17 11:02:29.581511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.581538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.155 [2024-11-17 11:02:29.581663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.581682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.155 [2024-11-17 11:02:29.581806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.581826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.155 [2024-11-17 11:02:29.581948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.581965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.155 #13 NEW cov: 12405 ft: 14986 corp: 12/41b lim: 5 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 ChangeBit- 00:08:05.155 [2024-11-17 11:02:29.651532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.651558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.155 [2024-11-17 11:02:29.651693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:05.155 [2024-11-17 11:02:29.651710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.155 [2024-11-17 11:02:29.651818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.651834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.155 [2024-11-17 11:02:29.651958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.651971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.155 #14 NEW cov: 12405 ft: 15021 corp: 13/45b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 InsertByte- 00:08:05.155 [2024-11-17 11:02:29.721518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.155 [2024-11-17 11:02:29.721545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.156 [2024-11-17 11:02:29.721675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.156 [2024-11-17 11:02:29.721693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.156 [2024-11-17 11:02:29.721817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.156 [2024-11-17 11:02:29.721833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.156 #15 NEW cov: 12405 ft: 15063 corp: 14/48b lim: 5 exec/s: 0 rss: 72Mb L: 3/5 MS: 1 EraseBytes- 00:08:05.156 [2024-11-17 11:02:29.791727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.156 [2024-11-17 11:02:29.791753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.156 [2024-11-17 11:02:29.791868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.156 [2024-11-17 11:02:29.791886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.156 [2024-11-17 11:02:29.792001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.156 [2024-11-17 11:02:29.792019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.677 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:05.677 
#16 NEW cov: 12428 ft: 15110 corp: 15/51b lim: 5 exec/s: 16 rss: 73Mb L: 3/5 MS: 1 ChangeByte- 00:08:05.677 [2024-11-17 11:02:30.113409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.677 [2024-11-17 11:02:30.113453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.677 [2024-11-17 11:02:30.113589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.677 [2024-11-17 11:02:30.113609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.677 [2024-11-17 11:02:30.113735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.677 [2024-11-17 11:02:30.113753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.677 [2024-11-17 11:02:30.113885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.677 [2024-11-17 11:02:30.113904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.677 [2024-11-17 11:02:30.114038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.677 [2024-11-17 11:02:30.114059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.677 #17 NEW cov: 12428 ft: 15161 corp: 16/56b lim: 5 exec/s: 17 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:05.677 [2024-11-17 11:02:30.163231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.677 [2024-11-17 11:02:30.163263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.677 [2024-11-17 11:02:30.163391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.163408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.163531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.163549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.163663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.163680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.678 #18 NEW cov: 12428 ft: 15259 corp: 17/60b lim: 5 exec/s: 18 rss: 73Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:05.678 [2024-11-17 11:02:30.213583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.213614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.213737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.213756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.213879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.213896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.214014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.214031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.214163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.214179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.678 #19 NEW cov: 12428 ft: 15346 corp: 18/65b lim: 5 exec/s: 19 rss: 73Mb L: 5/5 MS: 1 CMP- DE: "\000\000"- 00:08:05.678 [2024-11-17 11:02:30.283736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.283763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.283893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.283927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.284045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.284062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.284181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 
11:02:30.284198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.678 [2024-11-17 11:02:30.284314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.678 [2024-11-17 11:02:30.284332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.678 #20 NEW cov: 12428 ft: 15359 corp: 19/70b lim: 5 exec/s: 20 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:08:05.939 [2024-11-17 11:02:30.333317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.333346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.939 [2024-11-17 11:02:30.333477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.333494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.939 [2024-11-17 11:02:30.333623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.333641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.939 #21 NEW cov: 12428 ft: 15411 corp: 20/73b lim: 5 exec/s: 21 rss: 73Mb L: 3/5 MS: 1 ChangeBit- 00:08:05.939 [2024-11-17 11:02:30.384167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.384198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.939 [2024-11-17 11:02:30.384322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.384339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.939 [2024-11-17 11:02:30.384459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.384476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.939 [2024-11-17 11:02:30.384596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.939 [2024-11-17 11:02:30.384614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.939 [2024-11-17 11:02:30.384735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.384753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.940 #22 NEW cov: 12428 ft: 15458 corp: 21/78b lim: 5 exec/s: 22 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:05.940 [2024-11-17 11:02:30.434275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.434302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.434434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.434453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.434579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.434596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.434710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.434726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.434848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.434865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.940 #23 NEW cov: 12428 ft: 15487 corp: 22/83b lim: 5 exec/s: 23 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:05.940 [2024-11-17 11:02:30.484194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.484222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.484354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.484371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.484489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.484508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.484630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.484647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.940 #24 NEW cov: 12428 ft: 15497 corp: 23/87b lim: 5 exec/s: 24 rss: 73Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:05.940 [2024-11-17 11:02:30.534586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.534615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.534738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.534755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.534881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.534899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.535017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.535033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.940 [2024-11-17 11:02:30.535153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.940 [2024-11-17 11:02:30.535170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.940 #25 NEW cov: 12428 ft: 15509 corp: 24/92b lim: 5 exec/s: 25 rss: 73Mb L: 5/5 MS: 1 InsertByte- 00:08:06.201 [2024-11-17 11:02:30.604447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.604475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.604591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.604608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.604729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.604747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.604877] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.604897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.201 #26 NEW cov: 12428 ft: 15582 corp: 25/96b lim: 5 exec/s: 26 rss: 73Mb L: 4/5 MS: 1 ChangeByte- 00:08:06.201 [2024-11-17 11:02:30.674637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.674664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.674779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.674795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.674918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.674936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.675082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.675097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.201 #27 NEW cov: 12428 ft: 15594 corp: 26/100b lim: 5 exec/s: 27 rss: 73Mb L: 4/5 MS: 1 ChangeBit- 00:08:06.201 [2024-11-17 11:02:30.725062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.725090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.725216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.725234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.725352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.725370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.725490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.725507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.725625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.725643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.201 #28 NEW cov: 12428 ft: 15614 corp: 27/105b lim: 5 exec/s: 28 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:08:06.201 [2024-11-17 11:02:30.795337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.201 [2024-11-17 11:02:30.795366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.201 [2024-11-17 11:02:30.795493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.202 [2024-11-17 11:02:30.795510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.202 [2024-11-17 11:02:30.795629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.202 [2024-11-17 11:02:30.795646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.202 [2024-11-17 11:02:30.795770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.202 [2024-11-17 11:02:30.795786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.202 [2024-11-17 11:02:30.795908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.202 [2024-11-17 11:02:30.795925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.202 #29 NEW cov: 12428 ft: 15624 corp: 28/110b lim: 5 exec/s: 29 rss: 74Mb L: 5/5 MS: 1 ChangeByte- 00:08:06.462 [2024-11-17 11:02:30.865568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.462 [2024-11-17 11:02:30.865596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.462 [2024-11-17 11:02:30.865726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.462 [2024-11-17 11:02:30.865743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.462 [2024-11-17 11:02:30.865868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.462 [2024-11-17 11:02:30.865884] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.463 [2024-11-17 11:02:30.866004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.866019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.463 [2024-11-17 11:02:30.866161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.866177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.463 #30 NEW cov: 12428 ft: 15669 corp: 29/115b lim: 5 exec/s: 30 rss: 74Mb L: 5/5 MS: 1 CopyPart- 00:08:06.463 [2024-11-17 11:02:30.915666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.915697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.463 [2024-11-17 11:02:30.915824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.915839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.463 [2024-11-17 11:02:30.915961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.915977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.463 [2024-11-17 11:02:30.916107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.916123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.463 [2024-11-17 11:02:30.916243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.463 [2024-11-17 11:02:30.916259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.463 #31 NEW cov: 12428 ft: 15682 corp: 30/120b lim: 5 exec/s: 15 rss: 74Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:06.463 #31 DONE cov: 12428 ft: 15682 corp: 30/120b lim: 5 exec/s: 15 rss: 74Mb 00:08:06.463 ###### Recommended dictionary. ###### 00:08:06.463 "\000\000" # Uses: 0 00:08:06.463 ###### End of recommended dictionary. 
###### 00:08:06.463 Done 31 runs in 2 second(s) 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:06.463 11:02:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:06.463 [2024-11-17 11:02:31.100952] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
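The run.sh trace above shows how each fuzzer instance is parameterized before launch. A minimal sketch of that setup follows, assuming the surrounding structure: only the commands actually visible in the xtrace are certain; the output redirections, variable scoping, and the "44" port prefix are reconstructed (the prefix matches the 4410/4411 ports seen in this log), so treat this as a reading aid, not the actual nvmf/run.sh source.

```bash
#!/usr/bin/env bash
# Hedged reconstruction of the per-fuzzer setup traced above -- not the real
# nvmf/run.sh. Values mirror the "start_llvm_fuzz 10 1 0x1" call in the log.
fuzzer_type=10   # fuzzer index (-Z 10)
timen=1          # run time in minutes (-t 1)
core=0x1         # reactor core mask (-m 0x1)

# Unique TCP port per fuzzer: assumed to be "44" plus the zero-padded index,
# consistent with "printf %02d 10" and "port=4410" in the trace.
port="44$(printf %02d "$fuzzer_type")"

# Rewrite the shared JSON template (which listens on 4420) for this port.
# The redirect to the per-fuzzer config file is assumed; xtrace omits it.
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"

# Two known, accepted leaks are suppressed so LSAN reports stay actionable;
# the suppression file is removed again after the run (see the rm -rf above).
suppress_file=/var/tmp/suppress_nvmf_fuzz
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > "$suppress_file"
export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"
```

The fuzzer binary is then launched with `-c "$nvmf_cfg"`, `-t "$timen"`, `-m "$core"`, a per-fuzzer corpus directory, and `-Z "$fuzzer_type"`, as in the invocation logged above.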
00:08:06.463 [2024-11-17 11:02:31.101022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152453 ] 00:08:06.724 [2024-11-17 11:02:31.297874] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.724 [2024-11-17 11:02:31.311409] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.724 [2024-11-17 11:02:31.363963] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.985 [2024-11-17 11:02:31.380282] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:06.985 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.985 INFO: Seed: 1552490541 00:08:06.985 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:06.985 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:06.985 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:06.985 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.985 #2 INITED exec/s: 0 rss: 65Mb 00:08:06.985 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.985 This may also happen if the target rejected all inputs we tried so far 00:08:06.985 [2024-11-17 11:02:31.445646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff89 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.985 [2024-11-17 11:02:31.445673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.246 NEW_FUNC[1/715]: 0x45f648 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:07.246 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.246 #12 NEW cov: 12206 ft: 12198 corp: 2/11b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 5 ChangeBit-CrossOver-ChangeByte-ChangeBit-CMP- DE: "\377\211\254\252\214\262\257\032"- 00:08:07.246 [2024-11-17 11:02:31.776524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.246 [2024-11-17 11:02:31.776558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.246 #13 NEW cov: 12336 ft: 12588 corp: 3/21b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeByte- 00:08:07.246 [2024-11-17 11:02:31.836556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.246 [2024-11-17 11:02:31.836582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.246 #14 NEW cov: 12342 ft: 12821 corp: 4/30b lim: 40 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\017"- 00:08:07.246 [2024-11-17 11:02:31.876687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aa089 cdw11:acaa8cb2 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:07.246 [2024-11-17 11:02:31.876713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.246 #15 NEW cov: 12427 ft: 13120 corp: 5/40b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeByte- 00:08:07.507 [2024-11-17 11:02:31.917029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3fffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:31.917060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:31.917118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:31.917135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:31.917190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:31.917204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.507 #18 NEW cov: 12427 ft: 13703 corp: 6/64b lim: 40 exec/s: 0 rss: 72Mb L: 24/24 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:07.507 [2024-11-17 11:02:31.956859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:31.956883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.507 #19 NEW cov: 12427 ft: 13811 corp: 7/75b lim: 40 exec/s: 0 rss: 72Mb L: 11/24 MS: 1 InsertByte- 00:08:07.507 [2024-11-17 11:02:32.017038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affff37 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.017067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.507 #20 NEW cov: 12427 ft: 13912 corp: 8/84b lim: 40 exec/s: 0 rss: 72Mb L: 9/24 MS: 1 ChangeByte- 00:08:07.507 [2024-11-17 11:02:32.077566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.077591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:32.077648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.077661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:32.077733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.077747] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:32.077801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000ac cdw11:aa8cb2af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.077815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.507 #21 NEW cov: 12427 ft: 14405 corp: 9/117b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:07.507 [2024-11-17 11:02:32.117681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:66565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.117706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:32.117766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.117780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:32.117837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.117853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.507 [2024-11-17 11:02:32.117908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.117921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.507 #22 NEW cov: 12427 ft: 14502 corp: 10/156b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:07.507 [2024-11-17 11:02:32.157422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660a26ff cdw11:89acaa8c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.507 [2024-11-17 11:02:32.157447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.768 #23 NEW cov: 12427 ft: 14577 corp: 11/167b lim: 40 exec/s: 0 rss: 72Mb L: 11/39 MS: 1 InsertByte- 00:08:07.768 [2024-11-17 11:02:32.197536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aa089 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.197561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.768 #24 NEW cov: 12427 ft: 14632 corp: 12/177b lim: 40 exec/s: 0 rss: 73Mb L: 10/39 MS: 1 ChangeByte- 00:08:07.768 [2024-11-17 11:02:32.257957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3fffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.257981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:07.768 [2024-11-17 11:02:32.258045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.258058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.768 [2024-11-17 11:02:32.258113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffeffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.258126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.768 #25 NEW cov: 12427 ft: 14648 corp: 13/201b lim: 40 exec/s: 0 rss: 73Mb L: 24/39 MS: 1 ChangeBit- 00:08:07.768 [2024-11-17 11:02:32.317874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.317899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.768 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:07.768 #26 NEW cov: 12450 ft: 14691 corp: 14/211b lim: 40 exec/s: 0 rss: 73Mb L: 10/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\017"- 00:08:07.768 [2024-11-17 11:02:32.378221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aa089 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.378247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.768 [2024-11-17 11:02:32.378306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff89acaa cdw11:8cb2af1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.378320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.768 #27 NEW cov: 12450 ft: 14946 corp: 15/229b lim: 40 exec/s: 0 rss: 73Mb L: 18/39 MS: 1 PersAutoDict- DE: "\377\211\254\252\214\262\257\032"- 00:08:07.768 [2024-11-17 11:02:32.418172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660a0000 cdw11:00aa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.768 [2024-11-17 11:02:32.418197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.029 #28 NEW cov: 12450 ft: 15011 corp: 16/239b lim: 40 exec/s: 28 rss: 73Mb L: 10/39 MS: 1 ChangeBinInt- 00:08:08.029 [2024-11-17 11:02:32.458658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affb9b9 cdw11:b9b9b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.458683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.458743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 
11:02:32.458757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.458813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.458826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.458884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b9b9b9ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.458897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.029 #29 NEW cov: 12450 ft: 15026 corp: 17/273b lim: 40 exec/s: 29 rss: 73Mb L: 34/39 MS: 1 InsertRepeatedBytes- 00:08:08.029 [2024-11-17 11:02:32.498770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.498795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.498870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00003a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.498884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.498940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.498953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.499010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000ac cdw11:aa8cb2af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.499023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.029 #30 NEW cov: 12450 ft: 15052 corp: 18/306b lim: 40 exec/s: 30 rss: 73Mb L: 33/39 MS: 1 ChangeByte- 00:08:08.029 [2024-11-17 11:02:32.558689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.558713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.558771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff0f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.558784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.029 #31 NEW cov: 12450 ft: 15120 corp: 19/323b lim: 40 exec/s: 31 rss: 73Mb L: 17/39 MS: 1 CrossOver- 00:08:08.029 [2024-11-17 11:02:32.598812] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.598836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.029 [2024-11-17 11:02:32.598892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00aa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.598905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.029 #32 NEW cov: 12450 ft: 15135 corp: 20/341b lim: 40 exec/s: 32 rss: 73Mb L: 18/39 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:08.029 [2024-11-17 11:02:32.658845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.029 [2024-11-17 11:02:32.658869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.291 #33 NEW cov: 12450 ft: 15157 corp: 21/352b lim: 40 exec/s: 33 rss: 73Mb L: 11/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:08.291 [2024-11-17 11:02:32.719023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:66000a26 cdw11:ff89acaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.719051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.291 #34 NEW cov: 12450 ft: 15169 corp: 22/364b lim: 40 exec/s: 34 rss: 73Mb L: 12/39 MS: 1 InsertByte- 00:08:08.291 [2024-11-17 11:02:32.779217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.779241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.291 #35 NEW cov: 12450 ft: 15196 corp: 23/375b lim: 40 exec/s: 35 rss: 73Mb L: 11/39 MS: 1 CopyPart- 00:08:08.291 [2024-11-17 11:02:32.819544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3fffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.819569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.291 [2024-11-17 11:02:32.819629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:efffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.819642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.291 [2024-11-17 11:02:32.819699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffeffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.819712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.291 #36 NEW cov: 12450 ft: 15208 corp: 24/399b lim: 40 exec/s: 36 rss: 
73Mb L: 24/39 MS: 1 ChangeBit- 00:08:08.291 [2024-11-17 11:02:32.879864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.879888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.291 [2024-11-17 11:02:32.879944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.879960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.291 [2024-11-17 11:02:32.880018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0000660a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.880031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.291 [2024-11-17 11:02:32.880093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.880107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.291 #37 NEW cov: 12450 ft: 15216 corp: 25/434b lim: 40 exec/s: 37 rss: 73Mb L: 35/39 MS: 1 CrossOver- 00:08:08.291 [2024-11-17 11:02:32.919712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aa089 cdw11:acaa8cb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.919736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.291 [2024-11-17 11:02:32.919795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff89ac41 cdw11:aa8cb2af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.291 [2024-11-17 11:02:32.919808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.552 #38 NEW cov: 12450 ft: 15221 corp: 26/453b lim: 40 exec/s: 38 rss: 74Mb L: 19/39 MS: 1 InsertByte- 00:08:08.552 [2024-11-17 11:02:32.979743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff6fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:32.979768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.552 #39 NEW cov: 12450 ft: 15240 corp: 27/464b lim: 40 exec/s: 39 rss: 74Mb L: 11/39 MS: 1 InsertByte- 00:08:08.552 [2024-11-17 11:02:33.040275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:000000fc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.040299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.040353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.040366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.040420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.040433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.040490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000ac cdw11:aa8cb2af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.040503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.552 #40 NEW cov: 12450 ft: 15247 corp: 28/497b lim: 40 exec/s: 40 rss: 74Mb L: 33/39 MS: 1 ChangeBinInt- 00:08:08.552 [2024-11-17 11:02:33.080415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ff660a60 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.080439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.080498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00003a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.080512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.080567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.080581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.080634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000ac cdw11:aa8cb2af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.080647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.552 #41 NEW cov: 12450 ft: 15248 corp: 29/530b lim: 40 exec/s: 41 rss: 74Mb L: 33/39 MS: 1 ShuffleBytes- 00:08:08.552 [2024-11-17 11:02:33.140408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff6fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.140432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.140490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.140504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.552 #42 NEW cov: 12450 ft: 15264 corp: 30/551b lim: 40 exec/s: 42 rss: 74Mb L: 21/39 MS: 1 
InsertRepeatedBytes- 00:08:08.552 [2024-11-17 11:02:33.200518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:59660aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.200541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.552 [2024-11-17 11:02:33.200600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:0000aa8c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.552 [2024-11-17 11:02:33.200613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.813 #43 NEW cov: 12450 ft: 15310 corp: 31/570b lim: 40 exec/s: 43 rss: 74Mb L: 19/39 MS: 1 InsertByte- 00:08:08.813 [2024-11-17 11:02:33.260541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:66000a26 cdw11:f889acaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.813 [2024-11-17 11:02:33.260566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.813 #44 NEW cov: 12450 ft: 15318 corp: 32/582b lim: 40 exec/s: 44 rss: 74Mb L: 12/39 MS: 1 ChangeBinInt- 00:08:08.813 [2024-11-17 11:02:33.320949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:000000fc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.813 [2024-11-17 11:02:33.320974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.813 [2024-11-17 11:02:33.321031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.321048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.814 [2024-11-17 11:02:33.321104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.321121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.814 #45 NEW cov: 12450 ft: 15352 corp: 33/607b lim: 40 exec/s: 45 rss: 74Mb L: 25/39 MS: 1 EraseBytes- 00:08:08.814 [2024-11-17 11:02:33.381219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:660aff60 cdw11:000000fc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.381242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.814 [2024-11-17 11:02:33.381300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.381313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.814 [2024-11-17 11:02:33.381367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.381381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.814 [2024-11-17 11:02:33.381435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000ac cdw11:aa660aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.381447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.814 #46 NEW cov: 12450 ft: 15363 corp: 34/641b lim: 40 exec/s: 46 rss: 74Mb L: 34/39 MS: 1 CrossOver- 00:08:08.814 [2024-11-17 11:02:33.421125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:59660aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.421150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.814 [2024-11-17 11:02:33.421208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:000000aa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.814 [2024-11-17 11:02:33.421221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.814 #47 NEW cov: 12450 ft: 15369 corp: 35/661b lim: 40 exec/s: 23 rss: 74Mb L: 20/39 MS: 1 InsertByte- 00:08:08.814 #47 DONE cov: 12450 ft: 15369 corp: 35/661b lim: 40 exec/s: 23 rss: 74Mb 00:08:08.814 ###### Recommended dictionary. ###### 00:08:08.814 "\377\211\254\252\214\262\257\032" # Uses: 1 00:08:08.814 "\377\377\377\377\377\377\377\017" # Uses: 1 00:08:08.814 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:08.814 ###### End of recommended dictionary. 
###### 00:08:08.814 Done 47 runs in 2 second(s) 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:09.074 11:02:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:09.074 [2024-11-17 11:02:33.606226] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
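One note between runs: the "Recommended dictionary" blocks that close each run (for example, the three `\377...` entries printed at the end of run 10 above) are emitted in libFuzzer's dictionary-file syntax, so they can be captured and fed back into later runs. A sketch under assumptions: the entries below are copied from the run-10 output; `-dict=` is a standard libFuzzer flag, but whether this llvm_nvme_fuzz wrapper forwards it unchanged is not shown anywhere in this log.

```bash
# Persist run 10's recommended entries as a libFuzzer dictionary file.
# The entry syntax is exactly what the fuzzer printed: a C-escaped quoted token.
cat > /tmp/nvmf_10.dict <<'EOF'
"\377\211\254\252\214\262\257\032"
"\377\377\377\377\377\377\377\017"
"\377\377\377\377\377\377\377\377"
EOF
# Assumed usage on a later run (libFuzzer flag pass-through not confirmed here):
# ./llvm_nvme_fuzz ... -dict=/tmp/nvmf_10.dict ...
```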
00:08:09.074 [2024-11-17 11:02:33.606297] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152807 ] 00:08:09.335 [2024-11-17 11:02:33.813499] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.335 [2024-11-17 11:02:33.827389] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.335 [2024-11-17 11:02:33.880144] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:09.335 [2024-11-17 11:02:33.896474] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:09.335 INFO: Running with entropic power schedule (0xFF, 100). 00:08:09.335 INFO: Seed: 4069475619 00:08:09.335 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:09.335 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:09.335 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:09.335 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.335 #2 INITED exec/s: 0 rss: 65Mb 00:08:09.335 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.335 This may also happen if the target rejected all inputs we tried so far 00:08:09.335 [2024-11-17 11:02:33.962900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.335 [2024-11-17 11:02:33.962940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.335 [2024-11-17 11:02:33.963075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.335 [2024-11-17 11:02:33.963092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.859 NEW_FUNC[1/714]: 0x4613b8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:09.859 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.859 #3 NEW cov: 12204 ft: 12236 corp: 2/23b lim: 40 exec/s: 0 rss: 71Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:09.859 [2024-11-17 11:02:34.294156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.294198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.859 [2024-11-17 11:02:34.294320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.294351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.859 [2024-11-17 11:02:34.294464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff 
cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.294481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.859 NEW_FUNC[1/2]: 0x1c474e8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:595 00:08:09.859 NEW_FUNC[2/2]: 0x1c4d1c8 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:962 00:08:09.859 #5 NEW cov: 12349 ft: 13107 corp: 3/51b lim: 40 exec/s: 0 rss: 72Mb L: 28/28 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:09.859 [2024-11-17 11:02:34.344516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.344544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.859 [2024-11-17 11:02:34.344677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.344693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.859 [2024-11-17 11:02:34.344812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.344829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.859 [2024-11-17 11:02:34.344955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.344973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.859 #6 NEW cov: 12355 ft: 13640 corp: 4/84b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CopyPart- 00:08:09.859 [2024-11-17 11:02:34.414153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.414181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.859 [2024-11-17 11:02:34.414303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e5a cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.414320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.859 #7 NEW cov: 12440 ft: 13898 corp: 5/106b lim: 40 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 ChangeBinInt- 00:08:09.859 [2024-11-17 11:02:34.464331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.859 [2024-11-17 11:02:34.464358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.860 [2024-11-17 11:02:34.464479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) 
qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.860 [2024-11-17 11:02:34.464496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.860 #8 NEW cov: 12440 ft: 14044 corp: 6/128b lim: 40 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 CopyPart- 00:08:09.860 [2024-11-17 11:02:34.514603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.860 [2024-11-17 11:02:34.514633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.860 [2024-11-17 11:02:34.514759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.860 [2024-11-17 11:02:34.514778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.118 #9 NEW cov: 12440 ft: 14112 corp: 7/151b lim: 40 exec/s: 0 rss: 72Mb L: 23/33 MS: 1 CopyPart- 00:08:10.118 [2024-11-17 11:02:34.584975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.118 [2024-11-17 11:02:34.585001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.118 [2024-11-17 11:02:34.585136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.118 [2024-11-17 11:02:34.585153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.118 [2024-11-17 11:02:34.585278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff9e9e9e cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.118 [2024-11-17 11:02:34.585294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.118 #15 NEW cov: 12440 ft: 14237 corp: 8/179b lim: 40 exec/s: 0 rss: 72Mb L: 28/33 MS: 1 CrossOver- 00:08:10.118 [2024-11-17 11:02:34.654694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.118 [2024-11-17 11:02:34.654721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.118 #16 NEW cov: 12440 ft: 15003 corp: 9/194b lim: 40 exec/s: 0 rss: 72Mb L: 15/33 MS: 1 CrossOver- 00:08:10.118 [2024-11-17 11:02:34.704810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:0affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.118 [2024-11-17 11:02:34.704838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.119 #17 NEW cov: 12440 ft: 15062 corp: 10/206b lim: 40 exec/s: 0 rss: 72Mb L: 12/33 MS: 1 CrossOver- 00:08:10.378 [2024-11-17 11:02:34.775047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a60ffff 
cdw11:0affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.378 [2024-11-17 11:02:34.775077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 #18 NEW cov: 12440 ft: 15123 corp: 11/218b lim: 40 exec/s: 0 rss: 72Mb L: 12/33 MS: 1 ChangeByte- 00:08:10.378 [2024-11-17 11:02:34.845296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:16010016 cdw11:010000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.378 [2024-11-17 11:02:34.845323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:10.378 #21 NEW cov: 12463 ft: 15165 corp: 12/232b lim: 40 exec/s: 0 rss: 73Mb L: 14/33 MS: 3 EraseBytes-CMP-CopyPart- DE: "\026\001\000\000"- 00:08:10.378 [2024-11-17 11:02:34.916071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.378 [2024-11-17 11:02:34.916100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 [2024-11-17 11:02:34.916242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.378 [2024-11-17 11:02:34.916259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.378 [2024-11-17 11:02:34.916396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff9e9e9e cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.378 [2024-11-17 11:02:34.916411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.378 #22 NEW cov: 12463 ft: 15187 corp: 13/260b lim: 40 exec/s: 22 rss: 73Mb L: 28/33 MS: 1 CrossOver- 00:08:10.378 [2024-11-17 11:02:34.966116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a99ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.379 [2024-11-17 11:02:34.966145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.379 [2024-11-17 11:02:34.966285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.379 [2024-11-17 11:02:34.966302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.379 [2024-11-17 11:02:34.966433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff9e9e9e cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.379 [2024-11-17 11:02:34.966450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.379 #23 NEW cov: 12463 ft: 15206 corp: 14/288b lim: 40 exec/s: 23 rss: 73Mb L: 28/33 MS: 1 ChangeByte- 00:08:10.640 [2024-11-17 11:02:35.036147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.036176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.640 [2024-11-17 11:02:35.036311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.036330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.640 #24 NEW cov: 12463 ft: 15223 corp: 15/310b lim: 40 exec/s: 24 rss: 73Mb L: 22/33 MS: 1 ChangeByte- 00:08:10.640 [2024-11-17 11:02:35.086883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a99ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.086912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.640 [2024-11-17 11:02:35.087047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.087065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.640 [2024-11-17 11:02:35.087190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff9e9e9e cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.087208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.640 [2024-11-17 11:02:35.087338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ff89acac cdw11:a8f30fea SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.087356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.640 #25 NEW cov: 12463 ft: 15278 corp: 16/346b lim: 40 exec/s: 25 rss: 73Mb L: 36/36 MS: 1 CMP- DE: "\377\211\254\254\250\363\017\352"- 00:08:10.640 [2024-11-17 11:02:35.156426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e969e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.156456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.640 [2024-11-17 11:02:35.156588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.156604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.640 #26 NEW cov: 12463 ft: 15323 corp: 17/368b lim: 40 exec/s: 26 rss: 73Mb L: 22/36 MS: 1 ChangeBit- 00:08:10.640 [2024-11-17 11:02:35.226406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.640 [2024-11-17 11:02:35.226435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.640 #27 NEW cov: 12463 ft: 15370 corp: 18/383b lim: 40 exec/s: 27 rss: 73Mb L: 15/36 MS: 1 ChangeBinInt- 00:08:10.901 [2024-11-17 11:02:35.296969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.296998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.297134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e5a cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.297151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.901 #28 NEW cov: 12463 ft: 15401 corp: 19/405b lim: 40 exec/s: 28 rss: 73Mb L: 22/36 MS: 1 CopyPart- 00:08:10.901 [2024-11-17 11:02:35.367164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9f9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.367191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.367318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.367334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.901 #29 NEW cov: 12463 ft: 15445 corp: 20/427b lim: 40 exec/s: 29 rss: 73Mb L: 22/36 MS: 1 ChangeBit- 00:08:10.901 [2024-11-17 11:02:35.417615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.417644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.417774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.417791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.417921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffff9e9e cdw11:9effffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.417937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.901 #30 NEW cov: 12463 ft: 15571 corp: 21/455b lim: 40 exec/s: 30 rss: 73Mb L: 28/36 MS: 1 CrossOver- 00:08:10.901 [2024-11-17 11:02:35.467755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.467784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.467913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.467930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.468053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff9e9e9e cdw11:b8b8b8ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.468069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.901 #31 NEW cov: 12463 ft: 15587 corp: 22/486b lim: 40 exec/s: 31 rss: 73Mb L: 31/36 MS: 1 InsertRepeatedBytes- 00:08:10.901 [2024-11-17 11:02:35.517598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e969e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.517625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.901 [2024-11-17 11:02:35.517766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.901 [2024-11-17 11:02:35.517781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.162 #32 NEW cov: 12463 ft: 15610 corp: 23/508b lim: 40 exec/s: 32 rss: 73Mb L: 22/36 MS: 1 ChangeBit- 00:08:11.162 [2024-11-17 11:02:35.587496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.162 [2024-11-17 11:02:35.587524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.162 #33 NEW cov: 12463 ft: 15643 corp: 24/523b lim: 40 exec/s: 33 rss: 73Mb L: 15/36 MS: 1 CopyPart- 00:08:11.162 [2024-11-17 11:02:35.637557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a9e9e16 cdw11:0100009e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.162 [2024-11-17 11:02:35.637583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.162 #34 NEW cov: 12463 ft: 15684 corp: 25/538b lim: 40 exec/s: 34 rss: 73Mb L: 15/36 MS: 1 PersAutoDict- DE: "\026\001\000\000"- 00:08:11.162 [2024-11-17 11:02:35.688100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a60ffff cdw11:0affff16 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.162 [2024-11-17 11:02:35.688126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.162 [2024-11-17 11:02:35.688244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:010000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.162 [2024-11-17 11:02:35.688271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.162 #35 NEW cov: 12463 ft: 15693 corp: 26/554b lim: 40 exec/s: 35 rss: 73Mb L: 16/36 MS: 1 PersAutoDict- DE: "\026\001\000\000"- 00:08:11.162 [2024-11-17 11:02:35.737967] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.162 [2024-11-17 11:02:35.737994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.162 #36 NEW cov: 12463 ft: 15727 corp: 27/569b lim: 40 exec/s: 36 rss: 73Mb L: 15/36 MS: 1 EraseBytes- 00:08:11.162 [2024-11-17 11:02:35.788056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a9e9e16 cdw11:0100009e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.162 [2024-11-17 11:02:35.788084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.423 #37 NEW cov: 12463 ft: 15746 corp: 28/584b lim: 40 exec/s: 37 rss: 73Mb L: 15/36 MS: 1 CopyPart- 00:08:11.423 [2024-11-17 11:02:35.858593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e9e9e cdw11:96969e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.423 [2024-11-17 11:02:35.858621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.423 [2024-11-17 11:02:35.858753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.423 [2024-11-17 11:02:35.858769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.423 #43 NEW cov: 12463 ft: 15748 corp: 29/606b lim: 40 exec/s: 43 rss: 73Mb L: 22/36 MS: 1 ChangeBit- 00:08:11.423 [2024-11-17 11:02:35.908821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:9e9e6261 cdw11:61729e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.423 [2024-11-17 11:02:35.908850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.423 [2024-11-17 11:02:35.908978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9e9e9e9e cdw11:9e9e9e9e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.423 [2024-11-17 11:02:35.908995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.423 #44 NEW cov: 12463 ft: 15757 corp: 30/628b lim: 40 exec/s: 22 rss: 73Mb L: 22/36 MS: 1 ChangeBinInt- 00:08:11.423 #44 DONE cov: 12463 ft: 15757 corp: 30/628b lim: 40 exec/s: 22 rss: 73Mb 00:08:11.423 ###### Recommended dictionary. ###### 00:08:11.423 "\026\001\000\000" # Uses: 3 00:08:11.423 "\377\211\254\254\250\363\017\352" # Uses: 0 00:08:11.423 ###### End of recommended dictionary. 
###### 00:08:11.423 Done 44 runs in 2 second(s) 00:08:11.423 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:11.423 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:11.423 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:11.424 11:02:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:11.684 [2024-11-17 11:02:36.093697] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:11.684 [2024-11-17 11:02:36.093766] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid153277 ] 00:08:11.684 [2024-11-17 11:02:36.291287] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.684 [2024-11-17 11:02:36.303692] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.945 [2024-11-17 11:02:36.355974] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.945 [2024-11-17 11:02:36.372274] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:11.945 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.945 INFO: Seed: 2248518927 00:08:11.945 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:11.945 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:11.945 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:11.945 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.945 #2 INITED exec/s: 0 rss: 65Mb 00:08:11.945 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:11.945 This may also happen if the target rejected all inputs we tried so far 00:08:11.945 [2024-11-17 11:02:36.442339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.945 [2024-11-17 11:02:36.442381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.205 NEW_FUNC[1/716]: 0x463128 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:12.205 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:12.205 #6 NEW cov: 12234 ft: 12223 corp: 2/12b lim: 40 exec/s: 0 rss: 72Mb L: 11/11 MS: 4 InsertByte-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:12.206 [2024-11-17 11:02:36.804010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffff6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.206 [2024-11-17 11:02:36.804057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.206 [2024-11-17 11:02:36.804204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6fffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.206 [2024-11-17 11:02:36.804222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.206 #7 NEW cov: 12347 ft: 13575 corp: 3/31b lim: 40 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:08:12.466 [2024-11-17 11:02:36.873884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.466 [2024-11-17 11:02:36.873917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:12.466 #13 NEW cov: 12353 ft: 13803 corp: 4/43b lim: 40 exec/s: 0 rss: 72Mb L: 12/19 MS: 1 InsertByte- 00:08:12.466 [2024-11-17 11:02:36.924177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.466 [2024-11-17 11:02:36.924211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.466 #14 NEW cov: 12438 ft: 14021 corp: 5/56b lim: 40 exec/s: 0 rss: 72Mb L: 13/19 MS: 1 CrossOver- 00:08:12.466 [2024-11-17 11:02:36.994548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffff6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.466 [2024-11-17 11:02:36.994577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.466 #15 NEW cov: 12438 ft: 14142 corp: 6/71b lim: 40 exec/s: 0 rss: 72Mb L: 15/19 MS: 1 EraseBytes- 00:08:12.466 [2024-11-17 11:02:37.065112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffff6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.466 [2024-11-17 11:02:37.065141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.466 [2024-11-17 11:02:37.065283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.466 [2024-11-17 11:02:37.065301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.466 #16 NEW cov: 12438 ft: 14207 corp: 7/91b lim: 40 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 InsertByte- 00:08:12.466 [2024-11-17 11:02:37.114894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:292b0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.466 [2024-11-17 11:02:37.114921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.727 #17 NEW cov: 12438 ft: 14352 corp: 8/104b lim: 40 exec/s: 0 rss: 72Mb L: 13/20 MS: 1 ChangeBit- 00:08:12.727 [2024-11-17 11:02:37.185132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:bfffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.185160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.727 #18 NEW cov: 12438 ft: 14380 corp: 9/116b lim: 40 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 ChangeBit- 00:08:12.727 [2024-11-17 11:02:37.235450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b7e2bff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.235478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.727 #19 NEW cov: 12438 ft: 14485 corp: 10/128b lim: 40 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 InsertByte- 00:08:12.727 [2024-11-17 11:02:37.286726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffff6f SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.286755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.727 [2024-11-17 11:02:37.286891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.286909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.727 [2024-11-17 11:02:37.287045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.287063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.727 [2024-11-17 11:02:37.287208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.287228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.727 [2024-11-17 11:02:37.287374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:6f6f6fff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.287391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.727 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:12.727 #20 NEW cov: 12461 ft: 15010 corp: 11/168b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:12.727 [2024-11-17 11:02:37.355707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.727 [2024-11-17 11:02:37.355733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.727 #21 NEW cov: 12461 ft: 15030 corp: 12/179b lim: 40 exec/s: 0 rss: 73Mb L: 11/40 MS: 1 CopyPart- 00:08:12.987 [2024-11-17 11:02:37.405939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2bab0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.405967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.988 #22 NEW cov: 12461 ft: 15057 corp: 13/192b lim: 40 exec/s: 22 rss: 73Mb L: 13/40 MS: 1 ChangeBit- 00:08:12.988 [2024-11-17 11:02:37.457546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffff6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.457571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.988 [2024-11-17 11:02:37.457710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.457727] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.988 [2024-11-17 11:02:37.457863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.457879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.988 [2024-11-17 11:02:37.458012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff00f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.458027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.988 [2024-11-17 11:02:37.458177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:6f6f6fff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.458195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.988 #23 NEW cov: 12461 ft: 15108 corp: 14/232b lim: 40 exec/s: 23 rss: 73Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:12.988 [2024-11-17 11:02:37.526432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.526459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.988 #24 NEW cov: 12461 ft: 15119 corp: 15/244b lim: 40 exec/s: 24 rss: 73Mb L: 12/40 MS: 1 ChangeBinInt- 00:08:12.988 [2024-11-17 11:02:37.576559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffff6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.988 [2024-11-17 11:02:37.576589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.988 #25 NEW cov: 12461 ft: 15147 corp: 16/259b lim: 40 exec/s: 25 rss: 73Mb L: 15/40 MS: 1 ChangeBinInt- 00:08:13.248 [2024-11-17 11:02:37.646862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffbf cdw11:41ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.248 [2024-11-17 11:02:37.646890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.248 #26 NEW cov: 12461 ft: 15160 corp: 17/271b lim: 40 exec/s: 26 rss: 73Mb L: 12/40 MS: 1 ShuffleBytes- 00:08:13.248 [2024-11-17 11:02:37.716960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2babffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.248 [2024-11-17 11:02:37.716988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.248 #27 NEW cov: 12461 ft: 15195 corp: 18/284b lim: 40 exec/s: 27 rss: 73Mb L: 13/40 MS: 1 ShuffleBytes- 00:08:13.248 [2024-11-17 11:02:37.787245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bff2a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.248 [2024-11-17 11:02:37.787273] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.248 #28 NEW cov: 12461 ft: 15201 corp: 19/297b lim: 40 exec/s: 28 rss: 73Mb L: 13/40 MS: 1 InsertByte- 00:08:13.248 [2024-11-17 11:02:37.837479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.248 [2024-11-17 11:02:37.837506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.248 #29 NEW cov: 12461 ft: 15223 corp: 20/306b lim: 40 exec/s: 29 rss: 73Mb L: 9/40 MS: 1 EraseBytes- 00:08:13.248 [2024-11-17 11:02:37.887597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2bab0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.248 [2024-11-17 11:02:37.887627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.509 #30 NEW cov: 12461 ft: 15240 corp: 21/319b lim: 40 exec/s: 30 rss: 73Mb L: 13/40 MS: 1 ShuffleBytes- 00:08:13.509 [2024-11-17 11:02:37.938153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffff6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:37.938180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:37.938331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:886f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:37.938347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.509 #31 NEW cov: 12461 ft: 15245 corp: 22/340b lim: 40 exec/s: 31 rss: 73Mb L: 21/40 MS: 1 InsertByte- 00:08:13.509 [2024-11-17 11:02:37.988356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:37.988387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:37.988530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:886f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:37.988547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.509 #32 NEW cov: 12461 ft: 15270 corp: 23/361b lim: 40 exec/s: 32 rss: 73Mb L: 21/40 MS: 1 CMP- DE: "\000\000\000\000\001\000\000\000"- 00:08:13.509 [2024-11-17 11:02:38.059674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffff6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.059701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.059841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.059859] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.060005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.060024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.060165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.060185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.060317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:6f6f6fff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.060336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.509 #33 NEW cov: 12461 ft: 15287 corp: 24/401b lim: 40 exec/s: 33 rss: 73Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:13.509 [2024-11-17 11:02:38.109994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.110023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.110169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.110188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.110329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.110349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.110487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.110505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.509 [2024-11-17 11:02:38.110641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:6f6f6fff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.509 [2024-11-17 11:02:38.110660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.509 #34 NEW cov: 12461 ft: 15301 corp: 25/441b lim: 40 exec/s: 34 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:13.770 [2024-11-17 11:02:38.179048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffff6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 
11:02:38.179077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.770 #35 NEW cov: 12461 ft: 15340 corp: 26/453b lim: 40 exec/s: 35 rss: 73Mb L: 12/40 MS: 1 EraseBytes- 00:08:13.770 [2024-11-17 11:02:38.230387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b8eff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.230416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.230566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.230584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.230734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.230751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.230897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.230914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.231055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:6f6f6fff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.231073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.770 #36 NEW cov: 12461 ft: 15429 corp: 27/493b lim: 40 exec/s: 36 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:13.770 [2024-11-17 11:02:38.300296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2bab0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.300323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.300463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff41e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.300484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.300623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.300642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.770 [2024-11-17 11:02:38.300773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.770 [2024-11-17 11:02:38.300791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.770 #37 NEW cov: 12461 ft: 15508 corp: 28/526b lim: 40 exec/s: 37 rss: 73Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:08:13.770 [2024-11-17 11:02:38.359555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:292b0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.770 [2024-11-17 11:02:38.359587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.770 #38 NEW cov: 12461 ft: 15552 corp: 29/539b lim: 40 exec/s: 38 rss: 73Mb L: 13/40 MS: 1 ShuffleBytes- 00:08:14.031 [2024-11-17 11:02:38.430279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2b2b938e cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.031 [2024-11-17 11:02:38.430309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.031 [2024-11-17 11:02:38.430442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f886f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.031 [2024-11-17 11:02:38.430462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.031 #39 NEW cov: 12461 ft: 15555 corp: 30/561b lim: 40 exec/s: 19 rss: 73Mb L: 22/40 MS: 1 InsertByte- 00:08:14.031 #39 DONE cov: 12461 ft: 15555 corp: 30/561b lim: 40 exec/s: 19 rss: 73Mb 00:08:14.031 ###### Recommended dictionary. ###### 00:08:14.031 "\000\000\000\000\001\000\000\000" # Uses: 0 00:08:14.031 ###### End of recommended dictionary. 
###### 00:08:14.031 Done 39 runs in 2 second(s) 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:14.031 11:02:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:14.031 [2024-11-17 11:02:38.599433] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:14.031 [2024-11-17 11:02:38.599521] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid153804 ] 00:08:14.291 [2024-11-17 11:02:38.798105] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.291 [2024-11-17 11:02:38.810640] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.291 [2024-11-17 11:02:38.862901] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:14.291 [2024-11-17 11:02:38.879199] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:14.291 INFO: Running with entropic power schedule (0xFF, 100). 00:08:14.291 INFO: Seed: 459549693 00:08:14.291 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:14.291 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:14.291 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:14.291 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.291 #2 INITED exec/s: 0 rss: 65Mb 00:08:14.291 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:14.291 This may also happen if the target rejected all inputs we tried so far 00:08:14.291 [2024-11-17 11:02:38.938014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.291 [2024-11-17 11:02:38.938045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.291 [2024-11-17 11:02:38.938105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.291 [2024-11-17 11:02:38.938118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.291 [2024-11-17 11:02:38.938190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.291 [2024-11-17 11:02:38.938204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.291 [2024-11-17 11:02:38.938259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.291 [2024-11-17 11:02:38.938272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.812 NEW_FUNC[1/715]: 0x464cf8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:14.812 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:14.812 #15 NEW cov: 12218 ft: 12219 corp: 2/37b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:14.812 [2024-11-17 11:02:39.290050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.290103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.812 [2024-11-17 11:02:39.290241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.290263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.812 [2024-11-17 11:02:39.290412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.290433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.812 #16 NEW cov: 12335 ft: 13494 corp: 3/61b lim: 40 exec/s: 0 rss: 72Mb L: 24/36 MS: 1 EraseBytes- 00:08:14.812 [2024-11-17 11:02:39.360329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.360358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.812 [2024-11-17 11:02:39.360483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.360500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.812 [2024-11-17 11:02:39.360624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.360640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.812 [2024-11-17 11:02:39.360767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.360783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.812 #17 NEW cov: 12341 ft: 13680 corp: 4/96b lim: 40 exec/s: 0 rss: 72Mb L: 35/36 MS: 1 CrossOver- 00:08:14.812 [2024-11-17 11:02:39.429802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.812 [2024-11-17 11:02:39.429832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.812 #18 NEW cov: 12426 ft: 14338 corp: 5/110b lim: 40 exec/s: 0 rss: 72Mb L: 14/36 MS: 1 EraseBytes- 00:08:15.072 [2024-11-17 11:02:39.480497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.480525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.072 [2024-11-17 11:02:39.480656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d28 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.480672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.072 [2024-11-17 11:02:39.480803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.480821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.072 #19 NEW cov: 12426 ft: 14416 corp: 6/135b lim: 40 exec/s: 0 rss: 72Mb L: 25/36 MS: 1 InsertByte- 00:08:15.072 [2024-11-17 11:02:39.530105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28280a28 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.530131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.072 #20 NEW cov: 12426 ft: 14486 corp: 7/143b lim: 40 exec/s: 0 rss: 72Mb L: 8/36 MS: 1 CrossOver- 00:08:15.072 [2024-11-17 11:02:39.600794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.600823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.072 [2024-11-17 11:02:39.600944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.600961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.072 [2024-11-17 11:02:39.601090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.601107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.072 #21 NEW cov: 12426 ft: 14596 corp: 8/167b lim: 40 exec/s: 0 rss: 72Mb L: 24/36 MS: 1 ChangeBit- 00:08:15.072 [2024-11-17 11:02:39.650953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.650987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.072 [2024-11-17 11:02:39.651120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d28 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.651139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.072 [2024-11-17 11:02:39.651263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:283b2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.072 [2024-11-17 11:02:39.651281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.072 #22 NEW cov: 12426 ft: 14628 corp: 9/193b lim: 40 exec/s: 0 rss: 72Mb L: 26/36 MS: 1 InsertByte- 00:08:15.072 [2024-11-17 11:02:39.721347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.073 [2024-11-17 11:02:39.721377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.073 [2024-11-17 11:02:39.721502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.073 [2024-11-17 11:02:39.721519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.073 [2024-11-17 11:02:39.721639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.073 [2024-11-17 11:02:39.721657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.073 [2024-11-17 11:02:39.721785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.073 [2024-11-17 11:02:39.721803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.333 #23 NEW cov: 12426 ft: 14653 corp: 10/229b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 ShuffleBytes- 00:08:15.333 [2024-11-17 11:02:39.771346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.771376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.771497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d29 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.771514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.771635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:283b2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.771652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.333 #24 NEW cov: 12426 ft: 14709 corp: 11/255b lim: 40 exec/s: 0 rss: 72Mb L: 26/36 MS: 1 ChangeBit- 00:08:15.333 [2024-11-17 11:02:39.841751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a28bc cdw11:bcbcbcbc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.841779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.841900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:bcbcbcbc cdw11:bcbcbcbc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.841916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.842046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:2828286d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.842063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.842189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29282828 cdw11:28283b28 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.842206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.333 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:15.333 #25 NEW cov: 12449 ft: 14810 corp: 12/294b lim: 40 exec/s: 0 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:15.333 [2024-11-17 11:02:39.911711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.911741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.911869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d28 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.911886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.912012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28302828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.912028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.333 #26 NEW cov: 12449 ft: 14850 corp: 13/319b lim: 40 exec/s: 26 rss: 73Mb L: 25/39 MS: 1 ChangeByte- 00:08:15.333 [2024-11-17 11:02:39.962099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.962126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.962255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.962271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.962395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.962410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.333 [2024-11-17 11:02:39.962531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:2828fdfd cdw11:fdfd2828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.333 [2024-11-17 11:02:39.962546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.594 #27 NEW cov: 12449 ft: 14867 corp: 14/358b lim: 40 exec/s: 27 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:15.594 [2024-11-17 11:02:40.032277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.032311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.594 [2024-11-17 11:02:40.032439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.032456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.594 [2024-11-17 11:02:40.032574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.032593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.594 [2024-11-17 11:02:40.032723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.032740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.594 #28 NEW cov: 12449 ft: 14896 corp: 15/394b lim: 40 exec/s: 28 rss: 73Mb L: 36/39 MS: 1 ShuffleBytes- 00:08:15.594 [2024-11-17 11:02:40.101805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.101835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.594 #29 NEW cov: 12449 ft: 14974 corp: 16/402b lim: 40 exec/s: 29 rss: 73Mb L: 8/39 MS: 1 CrossOver- 00:08:15.594 [2024-11-17 11:02:40.171986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2828280a cdw11:0a282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.172016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.594 #31 NEW cov: 12449 ft: 14995 corp: 17/412b lim: 40 exec/s: 31 rss: 73Mb L: 10/39 MS: 2 EraseBytes-CopyPart- 00:08:15.594 [2024-11-17 11:02:40.222912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:15.594 [2024-11-17 11:02:40.222941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.594 [2024-11-17 11:02:40.223075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.223109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.594 [2024-11-17 11:02:40.223248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.223263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.594 [2024-11-17 11:02:40.223392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.594 [2024-11-17 11:02:40.223408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.594 #32 NEW cov: 12449 ft: 15084 corp: 18/450b lim: 40 exec/s: 32 rss: 73Mb L: 38/39 MS: 1 CopyPart- 00:08:15.855 [2024-11-17 11:02:40.272249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a282828 cdw11:0a282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.272279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.855 #33 NEW cov: 12449 ft: 15109 corp: 19/458b lim: 40 exec/s: 33 rss: 73Mb L: 8/39 MS: 1 ShuffleBytes- 00:08:15.855 [2024-11-17 11:02:40.343418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.343446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.343577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.343593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.343745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.343761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.343890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.343908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.344033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 
cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.344052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.855 #34 NEW cov: 12449 ft: 15168 corp: 20/498b lim: 40 exec/s: 34 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:15.855 [2024-11-17 11:02:40.413496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.413525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.413663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d29 cdw11:28282800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.413681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.413814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:04000000 cdw11:00000028 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.413832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.413962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:283b2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.413979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.855 #35 NEW cov: 12449 ft: 15185 corp: 21/532b lim: 40 exec/s: 35 rss: 73Mb L: 34/40 MS: 1 CMP- DE: "\000\004\000\000\000\000\000\000"- 00:08:15.855 [2024-11-17 11:02:40.463624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.463655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.463779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d28 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.463801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.463929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28302800 cdw11:04000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.463945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.855 [2024-11-17 11:02:40.464080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000028 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.855 [2024-11-17 11:02:40.464097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.855 #36 NEW cov: 12449 ft: 15193 corp: 
22/565b lim: 40 exec/s: 36 rss: 73Mb L: 33/40 MS: 1 PersAutoDict- DE: "\000\004\000\000\000\000\000\000"- 00:08:16.117 [2024-11-17 11:02:40.533571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.533616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.533741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28286d28 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.533756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.533873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.533889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.117 #37 NEW cov: 12449 ft: 15229 corp: 23/590b lim: 40 exec/s: 37 rss: 73Mb L: 25/40 MS: 1 ShuffleBytes- 00:08:16.117 [2024-11-17 11:02:40.583325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad8d7 cdw11:d7d02828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.583352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.117 #38 NEW cov: 12449 ft: 15257 corp: 24/598b lim: 40 exec/s: 38 rss: 73Mb L: 8/40 MS: 1 ChangeBinInt- 00:08:16.117 [2024-11-17 11:02:40.633820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.633849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.633978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.633994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.634149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.634164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.117 #39 NEW cov: 12449 ft: 15277 corp: 25/622b lim: 40 exec/s: 39 rss: 73Mb L: 24/40 MS: 1 CopyPart- 00:08:16.117 [2024-11-17 11:02:40.704281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.704309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.704445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28202828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.704462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.704583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.704610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.704727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.704743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.117 #40 NEW cov: 12449 ft: 15294 corp: 26/658b lim: 40 exec/s: 40 rss: 73Mb L: 36/40 MS: 1 ChangeBit- 00:08:16.117 [2024-11-17 11:02:40.754446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a2a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.754475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.754595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.754622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.754742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.754757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.117 [2024-11-17 11:02:40.754880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.117 [2024-11-17 11:02:40.754896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.378 #41 NEW cov: 12449 ft: 15305 corp: 27/694b lim: 40 exec/s: 41 rss: 73Mb L: 36/40 MS: 1 ChangeBit- 00:08:16.378 [2024-11-17 11:02:40.804499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.378 [2024-11-17 11:02:40.804526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.378 [2024-11-17 11:02:40.804646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c282828 cdw11:28282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.378 [2024-11-17 11:02:40.804664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.378 [2024-11-17 
11:02:40.804790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.378 [2024-11-17 11:02:40.804806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.378 [2024-11-17 11:02:40.804928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00282828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.378 [2024-11-17 11:02:40.804945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.378 #42 NEW cov: 12449 ft: 15351 corp: 28/726b lim: 40 exec/s: 42 rss: 74Mb L: 32/40 MS: 1 PersAutoDict- DE: "\000\004\000\000\000\000\000\000"- 00:08:16.378 [2024-11-17 11:02:40.874176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a2828 cdw11:28a82828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.378 [2024-11-17 11:02:40.874207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.378 #43 NEW cov: 12449 ft: 15381 corp: 29/734b lim: 40 exec/s: 43 rss: 74Mb L: 8/40 MS: 1 ChangeBit- 00:08:16.378 [2024-11-17 11:02:40.924205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:300a2800 cdw11:04000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.378 [2024-11-17 11:02:40.924231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.378 #48 NEW cov: 12449 ft: 15395 corp: 30/749b lim: 40 exec/s: 24 rss: 74Mb L: 15/40 MS: 5 EraseBytes-ShuffleBytes-ChangeBit-ChangeBinInt-PersAutoDict- DE: "\000\004\000\000\000\000\000\000"- 00:08:16.378 #48 DONE cov: 12449 ft: 15395 corp: 30/749b lim: 40 exec/s: 24 rss: 74Mb 00:08:16.378 ###### Recommended dictionary. ###### 00:08:16.378 "\000\004\000\000\000\000\000\000" # Uses: 3 00:08:16.378 ###### End of recommended dictionary. 
###### 00:08:16.378 Done 48 runs in 2 second(s) 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:16.639 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:16.640 11:02:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:16.640 [2024-11-17 11:02:41.110723] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:16.640 [2024-11-17 11:02:41.110816] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154097 ] 00:08:16.901 [2024-11-17 11:02:41.310347] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.901 [2024-11-17 11:02:41.322854] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.901 [2024-11-17 11:02:41.375302] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.901 [2024-11-17 11:02:41.391607] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:16.901 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.901 INFO: Seed: 2972557085 00:08:16.901 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:16.901 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:16.901 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:16.901 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.901 #2 INITED exec/s: 0 rss: 66Mb 00:08:16.901 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.901 This may also happen if the target rejected all inputs we tried so far 00:08:16.901 [2024-11-17 11:02:41.461703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.901 [2024-11-17 11:02:41.461748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.162 NEW_FUNC[1/716]: 0x4668c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:17.162 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:17.162 #11 NEW cov: 12214 ft: 12196 corp: 2/9b lim: 35 exec/s: 0 rss: 72Mb L: 8/8 MS: 4 ChangeByte-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:17.162 [2024-11-17 11:02:41.792455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.162 [2024-11-17 11:02:41.792511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.423 #12 NEW cov: 12329 ft: 12951 corp: 3/16b lim: 35 exec/s: 0 rss: 72Mb L: 7/8 MS: 1 EraseBytes- 00:08:17.423 [2024-11-17 11:02:41.852686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.423 [2024-11-17 11:02:41.852722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.423 [2024-11-17 11:02:41.852841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.423 [2024-11-17 11:02:41.852866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.423 #18 NEW cov: 12335 ft: 13850 corp: 4/31b lim: 35 exec/s: 0 rss: 72Mb L: 
15/15 MS: 1 InsertRepeatedBytes- 00:08:17.423 [2024-11-17 11:02:41.892535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.423 [2024-11-17 11:02:41.892568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.423 #19 NEW cov: 12420 ft: 14064 corp: 5/38b lim: 35 exec/s: 0 rss: 72Mb L: 7/15 MS: 1 ChangeBit- 00:08:17.423 [2024-11-17 11:02:41.962654] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.423 [2024-11-17 11:02:41.962682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.423 #20 NEW cov: 12427 ft: 14186 corp: 6/47b lim: 35 exec/s: 0 rss: 72Mb L: 9/15 MS: 1 CMP- DE: "\016\000"- 00:08:17.423 [2024-11-17 11:02:42.023084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.423 [2024-11-17 11:02:42.023115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.423 [2024-11-17 11:02:42.023242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.423 [2024-11-17 11:02:42.023262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.423 #21 NEW cov: 12427 ft: 14317 corp: 7/62b lim: 35 exec/s: 0 rss: 73Mb L: 15/15 MS: 1 CrossOver- 00:08:17.684 [2024-11-17 11:02:42.093125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.093152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.684 #22 NEW cov: 12427 ft: 14419 corp: 8/71b lim: 35 exec/s: 0 rss: 73Mb L: 9/15 MS: 1 ChangeByte- 00:08:17.684 [2024-11-17 11:02:42.163234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.163261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.684 #23 NEW cov: 12427 ft: 14465 corp: 9/80b lim: 35 exec/s: 0 rss: 73Mb L: 9/15 MS: 1 ShuffleBytes- 00:08:17.684 [2024-11-17 11:02:42.204162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.204195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.684 [2024-11-17 11:02:42.204313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.204336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.684 [2024-11-17 11:02:42.204458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.204483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.684 [2024-11-17 11:02:42.204611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.204633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.684 #24 NEW cov: 12427 ft: 14815 corp: 10/109b lim: 35 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 CopyPart- 00:08:17.684 [2024-11-17 11:02:42.263547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.263580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.684 #25 NEW cov: 12427 ft: 14931 corp: 11/116b lim: 35 exec/s: 0 rss: 73Mb L: 7/29 MS: 1 ChangeBit- 00:08:17.684 [2024-11-17 11:02:42.313910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.313938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.684 [2024-11-17 11:02:42.314061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.684 [2024-11-17 11:02:42.314090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.684 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:17.684 #26 NEW cov: 12450 ft: 14968 corp: 12/132b lim: 35 exec/s: 0 rss: 73Mb L: 16/29 MS: 1 CrossOver- 00:08:17.945 [2024-11-17 11:02:42.353739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.353775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.945 #27 NEW cov: 12450 ft: 15040 corp: 13/140b lim: 35 exec/s: 0 rss: 73Mb L: 8/29 MS: 1 ShuffleBytes- 00:08:17.945 [2024-11-17 11:02:42.393865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.393892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.945 #28 NEW cov: 12450 ft: 15071 corp: 14/149b lim: 35 exec/s: 28 rss: 73Mb L: 9/29 MS: 1 ChangeBit- 00:08:17.945 [2024-11-17 11:02:42.454293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.454328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.945 [2024-11-17 11:02:42.454451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 
11:02:42.454470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.945 #29 NEW cov: 12450 ft: 15089 corp: 15/164b lim: 35 exec/s: 29 rss: 73Mb L: 15/29 MS: 1 ShuffleBytes- 00:08:17.945 [2024-11-17 11:02:42.494465] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.494498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.945 [2024-11-17 11:02:42.494618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.494643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.945 #30 NEW cov: 12450 ft: 15099 corp: 16/181b lim: 35 exec/s: 30 rss: 73Mb L: 17/29 MS: 1 PersAutoDict- DE: "\016\000"- 00:08:17.945 [2024-11-17 11:02:42.534756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.534792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.945 [2024-11-17 11:02:42.534915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.534932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.945 [2024-11-17 11:02:42.535051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.535079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.945 #31 NEW cov: 12450 ft: 15263 corp: 17/204b lim: 35 exec/s: 31 rss: 73Mb L: 23/29 MS: 1 CMP- DE: "\000\000\000\000\002>4\265"- 00:08:17.945 [2024-11-17 11:02:42.574301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.945 [2024-11-17 11:02:42.574334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.945 #32 NEW cov: 12450 ft: 15288 corp: 18/211b lim: 35 exec/s: 32 rss: 73Mb L: 7/29 MS: 1 CopyPart- 00:08:18.206 [2024-11-17 11:02:42.625733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.625765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.625901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.625922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.626045] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:6 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.626071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.626195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.626214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.626336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.626352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.206 NEW_FUNC[1/1]: 0x487e18 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:18.206 #33 NEW cov: 12460 ft: 15366 corp: 19/246b lim: 35 exec/s: 33 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:18.206 [2024-11-17 11:02:42.675194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.675231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.675352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.675370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.675498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.675519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.206 #34 NEW cov: 12460 ft: 15401 corp: 20/269b lim: 35 exec/s: 34 rss: 73Mb L: 23/35 MS: 1 ChangeByte- 00:08:18.206 [2024-11-17 11:02:42.744929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.744961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.206 #35 NEW cov: 12460 ft: 15409 corp: 21/280b lim: 35 exec/s: 35 rss: 73Mb L: 11/35 MS: 1 CrossOver- 00:08:18.206 [2024-11-17 11:02:42.816170] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.206 [2024-11-17 11:02:42.816202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.206 [2024-11-17 11:02:42.816335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.207 [2024-11-17 11:02:42.816360] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.207 [2024-11-17 11:02:42.816486] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.207 [2024-11-17 11:02:42.816511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.207 [2024-11-17 11:02:42.816640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.207 [2024-11-17 11:02:42.816664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.207 [2024-11-17 11:02:42.816794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.207 [2024-11-17 11:02:42.816818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.207 #36 NEW cov: 12460 ft: 15417 corp: 22/315b lim: 35 exec/s: 36 rss: 73Mb L: 35/35 MS: 1 CopyPart- 00:08:18.467 [2024-11-17 11:02:42.885549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.885579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.467 [2024-11-17 11:02:42.885717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.885735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.467 #37 NEW cov: 12460 ft: 15446 corp: 23/334b lim: 35 exec/s: 37 rss: 73Mb L: 19/35 MS: 1 EraseBytes- 00:08:18.467 [2024-11-17 11:02:42.956628] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.956666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.467 [2024-11-17 11:02:42.956790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.956814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.467 [2024-11-17 11:02:42.956943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:6 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.956971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.467 [2024-11-17 11:02:42.957099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.957123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.467 
[2024-11-17 11:02:42.957251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.957273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.467 #38 NEW cov: 12460 ft: 15466 corp: 24/369b lim: 35 exec/s: 38 rss: 74Mb L: 35/35 MS: 1 CopyPart- 00:08:18.467 [2024-11-17 11:02:42.995864] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.467 [2024-11-17 11:02:42.995899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.467 [2024-11-17 11:02:42.996018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:42.996047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.468 #39 NEW cov: 12460 ft: 15483 corp: 25/384b lim: 35 exec/s: 39 rss: 74Mb L: 15/35 MS: 1 ChangeBinInt- 00:08:18.468 [2024-11-17 11:02:43.046536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:43.046573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.468 [2024-11-17 11:02:43.046705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:43.046728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.468 [2024-11-17 11:02:43.046844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:43.046866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.468 [2024-11-17 11:02:43.046992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:43.047016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.468 #40 NEW cov: 12460 ft: 15487 corp: 26/413b lim: 35 exec/s: 40 rss: 74Mb L: 29/35 MS: 1 ChangeBit- 00:08:18.468 [2024-11-17 11:02:43.116302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:43.116337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.468 [2024-11-17 11:02:43.116473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.468 [2024-11-17 11:02:43.116492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.728 #41 NEW cov: 12460 ft: 
15494 corp: 27/432b lim: 35 exec/s: 41 rss: 74Mb L: 19/35 MS: 1 CMP- DE: "\001\000\002\000"- 00:08:18.728 [2024-11-17 11:02:43.187121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.187157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.728 [2024-11-17 11:02:43.187299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.187322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.728 [2024-11-17 11:02:43.187451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.187470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.728 [2024-11-17 11:02:43.187596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.187618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.728 #42 NEW cov: 12460 ft: 15513 corp: 28/461b lim: 35 exec/s: 42 rss: 74Mb L: 29/35 MS: 1 CopyPart- 00:08:18.728 [2024-11-17 11:02:43.236350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.236386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.728 #43 NEW cov: 12460 ft: 15609 corp: 29/469b lim: 35 exec/s: 43 rss: 74Mb L: 8/35 MS: 1 InsertByte- 00:08:18.728 [2024-11-17 11:02:43.297164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.297197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.728 [2024-11-17 11:02:43.297324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.728 [2024-11-17 11:02:43.297342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.728 NEW_FUNC[1/1]: 0x13995c8 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1766 00:08:18.729 #45 NEW cov: 12483 ft: 15661 corp: 30/491b lim: 35 exec/s: 45 rss: 74Mb L: 22/35 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:18.729 [2024-11-17 11:02:43.356696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.729 [2024-11-17 11:02:43.356730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.729 #46 NEW cov: 12483 ft: 15686 corp: 31/498b lim: 35 exec/s: 46 rss: 74Mb L: 7/35 MS: 1 ChangeBinInt- 00:08:18.990 
[2024-11-17 11:02:43.396773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.990 [2024-11-17 11:02:43.396801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.990 #47 NEW cov: 12483 ft: 15710 corp: 32/507b lim: 35 exec/s: 47 rss: 74Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:18.990 [2024-11-17 11:02:43.437479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.990 [2024-11-17 11:02:43.437506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.990 [2024-11-17 11:02:43.437630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.990 [2024-11-17 11:02:43.437647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.990 #48 NEW cov: 12483 ft: 15733 corp: 33/529b lim: 35 exec/s: 24 rss: 74Mb L: 22/35 MS: 1 ChangeBinInt- 00:08:18.990 #48 DONE cov: 12483 ft: 15733 corp: 33/529b lim: 35 exec/s: 24 rss: 74Mb 00:08:18.990 ###### Recommended dictionary. ###### 00:08:18.990 "\016\000" # Uses: 1 00:08:18.990 "\000\000\000\000\002>4\265" # Uses: 0 00:08:18.990 "\001\000\002\000" # Uses: 0 00:08:18.990 ###### End of recommended dictionary. ###### 00:08:18.990 Done 48 runs in 2 second(s) 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.990 11:02:43 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.990 11:02:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:18.990 [2024-11-17 11:02:43.619879] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:18.990 [2024-11-17 11:02:43.619964] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154627 ] 00:08:19.251 [2024-11-17 11:02:43.819492] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.251 [2024-11-17 11:02:43.832977] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.251 [2024-11-17 11:02:43.885594] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:19.251 [2024-11-17 11:02:43.901910] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:19.511 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.511 INFO: Seed: 1188612287 00:08:19.511 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:19.511 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:19.511 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:19.511 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.511 #2 INITED exec/s: 0 rss: 65Mb 00:08:19.511 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
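The nvmf/run.sh trace above shows how each target run is sandboxed before launch: LSAN is pointed at a per-run suppression file, the two known-acceptable leaks are whitelisted, and llvm_nvme_fuzz is started against the TCP listener just opened on port 4415. A minimal shell sketch of that sequence, assuming the echoed leak entries are redirected into the suppression file (bash xtrace does not print redirections) and using a hypothetical $rootdir for the checkout path:

    # sketch of the traced launch sequence; the redirections into the
    # suppression file are an assumption, since xtrace does not show them
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0"
    echo "leak:spdk_nvmf_qpair_disconnect" > "$suppress_file"
    echo "leak:nvmf_ctrlr_create" >> "$suppress_file"
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$rootdir/../output/llvm/" \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' \
        -c /tmp/fuzz_json_15.conf -t 1 -D "$rootdir/../corpus/llvm_nvmf_15" -Z 15

The -m 0x1 core mask and -s 512 memory size reappear in the DPDK EAL parameters printed above (-c 0x1, -m 512), and -Z 15 selects which of the numbered fuzz targets to exercise.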
00:08:19.511 This may also happen if the target rejected all inputs we tried so far 00:08:19.772 NEW_FUNC[1/702]: 0x467e08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:19.772 NEW_FUNC[2/702]: 0x487e18 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:19.772 #6 NEW cov: 12068 ft: 12053 corp: 2/9b lim: 35 exec/s: 0 rss: 72Mb L: 8/8 MS: 4 CrossOver-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:19.772 [2024-11-17 11:02:44.298815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000058a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.772 [2024-11-17 11:02:44.298871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.772 NEW_FUNC[1/14]: 0x1957158 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:19.772 NEW_FUNC[2/14]: 0x1957398 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:19.772 #7 NEW cov: 12330 ft: 13133 corp: 3/25b lim: 35 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 CMP- DE: "\001\212\254\262\003\316\225\212"- 00:08:19.772 [2024-11-17 11:02:44.368313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.772 [2024-11-17 11:02:44.368340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.772 #12 NEW cov: 12336 ft: 13398 corp: 4/34b lim: 35 exec/s: 0 rss: 72Mb L: 9/16 MS: 5 InsertByte-ChangeByte-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:19.772 [2024-11-17 11:02:44.408429] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.772 [2024-11-17 11:02:44.408457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.033 #13 NEW cov: 12421 ft: 13588 corp: 5/44b lim: 35 exec/s: 0 rss: 72Mb L: 10/16 MS: 1 CrossOver- 00:08:20.033 [2024-11-17 11:02:44.468609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.033 [2024-11-17 11:02:44.468635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.033 #14 NEW cov: 12421 ft: 13672 corp: 6/54b lim: 35 exec/s: 0 rss: 72Mb L: 10/16 MS: 1 ChangeBit- 00:08:20.033 [2024-11-17 11:02:44.528956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000058a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.033 [2024-11-17 11:02:44.528984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.033 #15 NEW cov: 12421 ft: 13791 corp: 7/70b lim: 35 exec/s: 0 rss: 73Mb L: 16/16 MS: 1 ChangeBinInt- 00:08:20.033 [2024-11-17 11:02:44.588925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.033 [2024-11-17 11:02:44.588951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:20.033 #16 NEW cov: 12421 ft: 13921 corp: 8/80b lim: 35 exec/s: 0 rss: 73Mb L: 10/16 MS: 1 ChangeBit- 00:08:20.033 #17 NEW cov: 12421 ft: 14004 corp: 9/88b lim: 35 exec/s: 0 rss: 73Mb L: 8/16 MS: 1 EraseBytes- 00:08:20.294 [2024-11-17 11:02:44.689539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.689566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.689639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.689654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.689713] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.689728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.294 #18 NEW cov: 12421 ft: 14307 corp: 10/114b lim: 35 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 CrossOver- 00:08:20.294 [2024-11-17 11:02:44.749499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000004ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.749525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.294 #19 NEW cov: 12421 ft: 14341 corp: 11/130b lim: 35 exec/s: 0 rss: 73Mb L: 16/26 MS: 1 CrossOver- 00:08:20.294 [2024-11-17 11:02:44.789868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.789893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.789958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.789972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.790034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.790052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.790114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.790129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.294 #20 NEW cov: 12421 ft: 14850 corp: 12/164b lim: 35 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 PersAutoDict- DE: "\001\212\254\262\003\316\225\212"- 00:08:20.294 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:20.294 #21 NEW cov: 12444 ft: 
14937 corp: 13/172b lim: 35 exec/s: 0 rss: 73Mb L: 8/34 MS: 1 ChangeByte- 00:08:20.294 [2024-11-17 11:02:44.890025] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.890055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.890117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.890131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.294 [2024-11-17 11:02:44.890191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000058a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.890205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.294 #22 NEW cov: 12444 ft: 14983 corp: 14/198b lim: 35 exec/s: 0 rss: 73Mb L: 26/34 MS: 1 PersAutoDict- DE: "\001\212\254\262\003\316\225\212"- 00:08:20.294 [2024-11-17 11:02:44.930027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000058a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.294 [2024-11-17 11:02:44.930057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.555 #23 NEW cov: 12444 ft: 15058 corp: 15/214b lim: 35 exec/s: 23 rss: 73Mb L: 16/34 MS: 1 CopyPart- 00:08:20.555 NEW_FUNC[1/1]: 0x4867d8 in feat_interrupt_coalescing /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:325 00:08:20.555 #24 NEW cov: 12466 ft: 15102 corp: 16/222b lim: 35 exec/s: 24 rss: 73Mb L: 8/34 MS: 1 ChangeBinInt- 00:08:20.555 [2024-11-17 11:02:45.030565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.030591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.555 [2024-11-17 11:02:45.030652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.030667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.555 [2024-11-17 11:02:45.030727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.030741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.555 [2024-11-17 11:02:45.030797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.030811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.555 #25 NEW cov: 12466 ft: 15167 corp: 17/256b lim: 35 exec/s: 25 rss: 73Mb L: 34/34 MS: 1 ChangeBit- 00:08:20.555 [2024-11-17 11:02:45.090505] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.090530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.555 #26 NEW cov: 12466 ft: 15228 corp: 18/272b lim: 35 exec/s: 26 rss: 73Mb L: 16/34 MS: 1 PersAutoDict- DE: "\001\212\254\262\003\316\225\212"- 00:08:20.555 [2024-11-17 11:02:45.150775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.150800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.555 [2024-11-17 11:02:45.150863] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000025e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.150877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.555 [2024-11-17 11:02:45.150952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.555 [2024-11-17 11:02:45.150967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.555 #27 NEW cov: 12466 ft: 15266 corp: 19/299b lim: 35 exec/s: 27 rss: 73Mb L: 27/34 MS: 1 InsertRepeatedBytes- 00:08:20.815 #28 NEW cov: 12466 ft: 15274 corp: 20/308b lim: 35 exec/s: 28 rss: 73Mb L: 9/34 MS: 1 InsertByte- 00:08:20.815 [2024-11-17 11:02:45.230746] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000026 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.815 [2024-11-17 11:02:45.230772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.816 #29 NEW cov: 12466 ft: 15287 corp: 21/318b lim: 35 exec/s: 29 rss: 73Mb L: 10/34 MS: 1 ChangeByte- 00:08:20.816 [2024-11-17 11:02:45.271061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000004ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.816 [2024-11-17 11:02:45.271087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.816 #30 NEW cov: 12466 ft: 15296 corp: 22/334b lim: 35 exec/s: 30 rss: 73Mb L: 16/34 MS: 1 ChangeBit- 00:08:20.816 [2024-11-17 11:02:45.330996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000026 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.816 [2024-11-17 11:02:45.331022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.816 #31 NEW cov: 12466 ft: 15317 corp: 23/345b lim: 35 exec/s: 31 rss: 74Mb L: 11/34 MS: 1 InsertByte- 00:08:20.816 #32 NEW cov: 12466 ft: 15419 corp: 24/365b lim: 35 exec/s: 32 rss: 74Mb L: 20/34 MS: 1 CMP- DE: "\011\000\000\000"- 00:08:20.816 [2024-11-17 11:02:45.451458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.816 [2024-11-17 11:02:45.451484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.816 [2024-11-17 11:02:45.451543] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES LBA RANGE TYPE cid:5 cdw10:00000603 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.816 [2024-11-17 11:02:45.451557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.076 NEW_FUNC[1/1]: 0x483058 in feat_lba_range_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:289 00:08:21.076 #33 NEW cov: 12477 ft: 15552 corp: 25/384b lim: 35 exec/s: 33 rss: 74Mb L: 19/34 MS: 1 EraseBytes- 00:08:21.076 [2024-11-17 11:02:45.511452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.076 [2024-11-17 11:02:45.511477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.076 #34 NEW cov: 12477 ft: 15578 corp: 26/392b lim: 35 exec/s: 34 rss: 74Mb L: 8/34 MS: 1 ShuffleBytes- 00:08:21.076 NEW_FUNC[1/1]: 0x4812a8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:21.076 #35 NEW cov: 12515 ft: 15616 corp: 27/408b lim: 35 exec/s: 35 rss: 74Mb L: 16/34 MS: 1 ShuffleBytes- 00:08:21.076 [2024-11-17 11:02:45.631775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.076 [2024-11-17 11:02:45.631801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.076 #36 NEW cov: 12515 ft: 15635 corp: 28/417b lim: 35 exec/s: 36 rss: 74Mb L: 9/34 MS: 1 InsertByte- 00:08:21.076 [2024-11-17 11:02:45.692306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.076 [2024-11-17 11:02:45.692332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.076 [2024-11-17 11:02:45.692397] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.076 [2024-11-17 11:02:45.692411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.076 #37 NEW cov: 12515 ft: 15648 corp: 29/438b lim: 35 exec/s: 37 rss: 74Mb L: 21/34 MS: 1 InsertRepeatedBytes- 00:08:21.337 #38 NEW cov: 12515 ft: 15672 corp: 30/446b lim: 35 exec/s: 38 rss: 74Mb L: 8/34 MS: 1 CopyPart- 00:08:21.337 [2024-11-17 11:02:45.772571] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.772598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.337 [2024-11-17 11:02:45.772659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.772674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.337 [2024-11-17 11:02:45.772753] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.772767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.337 [2024-11-17 11:02:45.772831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000025e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.772844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.337 #39 NEW cov: 12515 ft: 15675 corp: 31/478b lim: 35 exec/s: 39 rss: 74Mb L: 32/34 MS: 1 CrossOver- 00:08:21.337 [2024-11-17 11:02:45.832494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.832520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.337 [2024-11-17 11:02:45.832584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES LBA RANGE TYPE cid:5 cdw10:00000603 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.832599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.337 #40 NEW cov: 12515 ft: 15716 corp: 32/497b lim: 35 exec/s: 40 rss: 74Mb L: 19/34 MS: 1 ChangeBit- 00:08:21.337 [2024-11-17 11:02:45.892653] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.892679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.337 [2024-11-17 11:02:45.892742] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.892760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.337 #41 NEW cov: 12515 ft: 15727 corp: 33/517b lim: 35 exec/s: 41 rss: 74Mb L: 20/34 MS: 1 EraseBytes- 00:08:21.337 [2024-11-17 11:02:45.932776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.932802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.337 [2024-11-17 11:02:45.932865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.337 [2024-11-17 11:02:45.932879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.337 #43 NEW cov: 12515 ft: 15760 corp: 34/536b lim: 35 exec/s: 21 rss: 74Mb L: 19/34 MS: 2 CrossOver-CrossOver- 00:08:21.337 #43 DONE cov: 12515 ft: 15760 corp: 34/536b lim: 35 exec/s: 21 rss: 74Mb 00:08:21.337 ###### Recommended dictionary. ###### 00:08:21.337 "\001\212\254\262\003\316\225\212" # Uses: 3 00:08:21.337 "\011\000\000\000" # Uses: 0 00:08:21.337 ###### End of recommended dictionary. 
###### 00:08:21.337 Done 43 runs in 2 second(s) 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:21.599 11:02:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:21.599 [2024-11-17 11:02:46.119772] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
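The same preparation repeats for target 16: the listener port is derived from the fuzzer index, a fresh corpus directory is created, and the shared fuzz_json.conf template gets its trsvcid rewritten before launch. A minimal sketch of that derivation, assuming the sed output lands in the per-target /tmp/fuzz_json_16.conf (the redirection is not shown in the trace) and again using a hypothetical $rootdir:

    fuzzer_type=16
    # port derivation inferred from the printf %02d / port=4416 pair in the trace
    port="44$(printf %02d "$fuzzer_type")"
    corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"
    mkdir -p "$corpus_dir"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
    # rewrite the default 4420 listener in the shared JSON template for this target
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"

With that config in place the target is launched exactly as for port 4415, swapping in /tmp/fuzz_json_16.conf, the llvm_nvmf_16 corpus directory, and -Z 16.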
00:08:21.599 [2024-11-17 11:02:46.119866] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155053 ] 00:08:21.860 [2024-11-17 11:02:46.325970] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.860 [2024-11-17 11:02:46.338400] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.860 [2024-11-17 11:02:46.390629] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.860 [2024-11-17 11:02:46.406962] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:21.860 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.860 INFO: Seed: 3694576823 00:08:21.860 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:21.860 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:21.860 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:21.860 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.860 #2 INITED exec/s: 0 rss: 65Mb 00:08:21.860 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.860 This may also happen if the target rejected all inputs we tried so far 00:08:21.860 [2024-11-17 11:02:46.482847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.860 [2024-11-17 11:02:46.482888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.380 NEW_FUNC[1/716]: 0x4692c8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:22.380 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.380 #9 NEW cov: 12289 ft: 12273 corp: 2/27b lim: 105 exec/s: 0 rss: 72Mb L: 26/26 MS: 2 InsertRepeatedBytes-CopyPart- 00:08:22.380 [2024-11-17 11:02:46.814099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.814154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 11:02:46.814308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.814337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.381 #28 NEW cov: 12420 ft: 13291 corp: 3/71b lim: 105 exec/s: 0 rss: 72Mb L: 44/44 MS: 4 ChangeBit-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:22.381 [2024-11-17 11:02:46.864437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.864472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 
11:02:46.864597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.864618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 11:02:46.864736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.864760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.381 #42 NEW cov: 12426 ft: 13927 corp: 4/148b lim: 105 exec/s: 0 rss: 72Mb L: 77/77 MS: 4 InsertByte-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:22.381 [2024-11-17 11:02:46.914516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.914552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 11:02:46.914657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.914681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 11:02:46.914802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.914822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.381 #43 NEW cov: 12511 ft: 14177 corp: 5/225b lim: 105 exec/s: 0 rss: 72Mb L: 77/77 MS: 1 ShuffleBytes- 00:08:22.381 [2024-11-17 11:02:46.984717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.984754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 11:02:46.984870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.984895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.381 [2024-11-17 11:02:46.985018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.381 [2024-11-17 11:02:46.985048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.381 #44 NEW cov: 12511 ft: 14342 corp: 6/302b lim: 105 exec/s: 0 rss: 72Mb L: 77/77 MS: 1 CrossOver- 00:08:22.641 [2024-11-17 11:02:47.054602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071159414783 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 
11:02:47.054636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.641 #45 NEW cov: 12511 ft: 14398 corp: 7/328b lim: 105 exec/s: 0 rss: 72Mb L: 26/77 MS: 1 ChangeByte- 00:08:22.641 [2024-11-17 11:02:47.125081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:25614222880669696 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.125115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.641 [2024-11-17 11:02:47.125226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.125248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.641 #46 NEW cov: 12511 ft: 14471 corp: 8/372b lim: 105 exec/s: 0 rss: 72Mb L: 44/77 MS: 1 ChangeByte- 00:08:22.641 [2024-11-17 11:02:47.195404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.195441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.641 [2024-11-17 11:02:47.195553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.195579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.641 [2024-11-17 11:02:47.195708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.195734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.641 #47 NEW cov: 12511 ft: 14533 corp: 9/449b lim: 105 exec/s: 0 rss: 72Mb L: 77/77 MS: 1 ShuffleBytes- 00:08:22.641 [2024-11-17 11:02:47.245530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.245566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.641 [2024-11-17 11:02:47.245677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.245697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.641 [2024-11-17 11:02:47.245832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.245861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.641 #53 NEW cov: 12511 ft: 14582 corp: 10/526b lim: 105 exec/s: 0 rss: 72Mb L: 77/77 MS: 1 
ShuffleBytes- 00:08:22.641 [2024-11-17 11:02:47.295268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.641 [2024-11-17 11:02:47.295301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.901 #54 NEW cov: 12511 ft: 14659 corp: 11/552b lim: 105 exec/s: 0 rss: 72Mb L: 26/77 MS: 1 ChangeByte- 00:08:22.901 [2024-11-17 11:02:47.345420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071159414783 len:8704 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.901 [2024-11-17 11:02:47.345453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.901 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:22.901 #55 NEW cov: 12534 ft: 14727 corp: 12/579b lim: 105 exec/s: 0 rss: 72Mb L: 27/77 MS: 1 InsertByte- 00:08:22.901 [2024-11-17 11:02:47.415735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.901 [2024-11-17 11:02:47.415770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.901 #56 NEW cov: 12534 ft: 14750 corp: 13/605b lim: 105 exec/s: 0 rss: 73Mb L: 26/77 MS: 1 ChangeBinInt- 00:08:22.901 [2024-11-17 11:02:47.465872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071146831871 len:8704 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.901 [2024-11-17 11:02:47.465900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.901 #57 NEW cov: 12534 ft: 14795 corp: 14/632b lim: 105 exec/s: 57 rss: 73Mb L: 27/77 MS: 1 ChangeByte- 00:08:22.901 [2024-11-17 11:02:47.536330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.901 [2024-11-17 11:02:47.536365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.901 [2024-11-17 11:02:47.536474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320784144665 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.901 [2024-11-17 11:02:47.536499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.902 [2024-11-17 11:02:47.536618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.902 [2024-11-17 11:02:47.536649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.162 #63 NEW cov: 12534 ft: 14811 corp: 15/709b lim: 105 exec/s: 63 rss: 73Mb L: 77/77 MS: 1 ChangeBinInt- 00:08:23.162 [2024-11-17 11:02:47.586499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:46004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 
11:02:47.586537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.586656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12948890938015724467 len:46004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.586679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.586804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12948890938015724467 len:46004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.586831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.162 #66 NEW cov: 12534 ft: 14826 corp: 16/781b lim: 105 exec/s: 66 rss: 73Mb L: 72/77 MS: 3 CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:08:23.162 [2024-11-17 11:02:47.636721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.636756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.636876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.636898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.637028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.637052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.162 #67 NEW cov: 12534 ft: 14858 corp: 17/847b lim: 105 exec/s: 67 rss: 73Mb L: 66/77 MS: 1 EraseBytes- 00:08:23.162 [2024-11-17 11:02:47.706554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071159414783 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.706588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.162 #68 NEW cov: 12534 ft: 14873 corp: 18/873b lim: 105 exec/s: 68 rss: 73Mb L: 26/77 MS: 1 CopyPart- 00:08:23.162 [2024-11-17 11:02:47.756942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.756975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.757116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16711680 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.757138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.162 #69 NEW cov: 12534 ft: 14906 corp: 19/918b lim: 105 exec/s: 69 
rss: 73Mb L: 45/77 MS: 1 InsertByte- 00:08:23.162 [2024-11-17 11:02:47.807489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.807522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.807623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6727636073941130589 len:23902 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.807646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.807765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6727636073941130589 len:23902 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.807788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.162 [2024-11-17 11:02:47.807910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18374686479671623680 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.162 [2024-11-17 11:02:47.807935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.422 #70 NEW cov: 12534 ft: 15416 corp: 20/1008b lim: 105 exec/s: 70 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:23.423 [2024-11-17 11:02:47.877224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:47.877259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.423 [2024-11-17 11:02:47.877372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16711680 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:47.877396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.423 #71 NEW cov: 12534 ft: 15503 corp: 21/1054b lim: 105 exec/s: 71 rss: 73Mb L: 46/90 MS: 1 InsertByte- 00:08:23.423 [2024-11-17 11:02:47.927387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:47.927424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.423 [2024-11-17 11:02:47.927541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:47.927562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.423 #72 NEW cov: 12534 ft: 15543 corp: 22/1098b lim: 105 exec/s: 72 rss: 73Mb L: 44/90 MS: 1 ShuffleBytes- 00:08:23.423 [2024-11-17 11:02:47.977681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:47.977716] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.423 [2024-11-17 11:02:47.977840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:47.977864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.423 #73 NEW cov: 12534 ft: 15554 corp: 23/1147b lim: 105 exec/s: 73 rss: 73Mb L: 49/90 MS: 1 EraseBytes- 00:08:23.423 [2024-11-17 11:02:48.047997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:48.048031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.423 [2024-11-17 11:02:48.048150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6727636073941130589 len:23902 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:48.048174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.423 [2024-11-17 11:02:48.048293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1576992768 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.423 [2024-11-17 11:02:48.048310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.683 #74 NEW cov: 12534 ft: 15563 corp: 24/1213b lim: 105 exec/s: 74 rss: 73Mb L: 66/90 MS: 1 EraseBytes- 00:08:23.683 [2024-11-17 11:02:48.118657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.118685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.683 [2024-11-17 11:02:48.118783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.118801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.683 [2024-11-17 11:02:48.118916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.118940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.683 [2024-11-17 11:02:48.119062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.119090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.683 [2024-11-17 11:02:48.119217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.119243] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.683 #75 NEW cov: 12534 ft: 15643 corp: 25/1318b lim: 105 exec/s: 75 rss: 73Mb L: 105/105 MS: 1 CrossOver- 00:08:23.683 [2024-11-17 11:02:48.167905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071159414783 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.167940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.683 #76 NEW cov: 12534 ft: 15647 corp: 26/1344b lim: 105 exec/s: 76 rss: 73Mb L: 26/105 MS: 1 CopyPart- 00:08:23.683 [2024-11-17 11:02:48.238504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.683 [2024-11-17 11:02:48.238545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.684 [2024-11-17 11:02:48.238670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320784144665 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.684 [2024-11-17 11:02:48.238693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.684 [2024-11-17 11:02:48.238817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504324825749785 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.684 [2024-11-17 11:02:48.238848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.684 #77 NEW cov: 12534 ft: 15698 corp: 27/1421b lim: 105 exec/s: 77 rss: 73Mb L: 77/105 MS: 1 CrossOver- 00:08:23.684 [2024-11-17 11:02:48.308842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16580311098958735078 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.684 [2024-11-17 11:02:48.308879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.684 [2024-11-17 11:02:48.309001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.684 [2024-11-17 11:02:48.309024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.684 [2024-11-17 11:02:48.309147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.684 [2024-11-17 11:02:48.309175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.945 #78 NEW cov: 12534 ft: 15722 corp: 28/1498b lim: 105 exec/s: 78 rss: 73Mb L: 77/105 MS: 1 ChangeBinInt- 00:08:23.945 [2024-11-17 11:02:48.378741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446492281436372991 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.945 [2024-11-17 11:02:48.378776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:23.945 #79 NEW cov: 12534 ft: 15761 corp: 29/1524b lim: 105 exec/s: 79 rss: 74Mb L: 26/105 MS: 1 ChangeBinInt- 00:08:23.945 [2024-11-17 11:02:48.449505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1808504321170020633 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.945 [2024-11-17 11:02:48.449538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.945 [2024-11-17 11:02:48.449611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1808504320784144665 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.945 [2024-11-17 11:02:48.449636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.945 [2024-11-17 11:02:48.449762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1808504320951916825 len:6426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.945 [2024-11-17 11:02:48.449783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.945 [2024-11-17 11:02:48.449898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1808504320951916825 len:49088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.945 [2024-11-17 11:02:48.449920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.945 #80 NEW cov: 12534 ft: 15763 corp: 30/1622b lim: 105 exec/s: 40 rss: 74Mb L: 98/105 MS: 1 InsertRepeatedBytes- 00:08:23.945 #80 DONE cov: 12534 ft: 15763 corp: 30/1622b lim: 105 exec/s: 40 rss: 74Mb 00:08:23.945 Done 80 runs in 2 second(s) 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 
traddr:127.0.0.1 trsvcid:4417' 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.945 11:02:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:24.206 [2024-11-17 11:02:48.619173] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:24.206 [2024-11-17 11:02:48.619243] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155451 ] 00:08:24.206 [2024-11-17 11:02:48.818768] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.206 [2024-11-17 11:02:48.831199] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.466 [2024-11-17 11:02:48.883507] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.466 [2024-11-17 11:02:48.899795] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:24.466 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.466 INFO: Seed: 1891627312 00:08:24.466 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:24.466 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:24.467 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:24.467 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.467 #2 INITED exec/s: 0 rss: 65Mb 00:08:24.467 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:24.467 This may also happen if the target rejected all inputs we tried so far 00:08:24.467 [2024-11-17 11:02:48.944670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.467 [2024-11-17 11:02:48.944704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.467 [2024-11-17 11:02:48.944754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.467 [2024-11-17 11:02:48.944772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.467 [2024-11-17 11:02:48.944801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.467 [2024-11-17 11:02:48.944818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.727 NEW_FUNC[1/717]: 0x46c648 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:24.727 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.727 #11 NEW cov: 12329 ft: 12328 corp: 2/96b lim: 120 exec/s: 0 rss: 72Mb L: 95/95 MS: 4 ChangeByte-CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:24.727 [2024-11-17 11:02:49.305574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.727 [2024-11-17 11:02:49.305612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.727 [2024-11-17 11:02:49.305661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.727 [2024-11-17 11:02:49.305680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.727 [2024-11-17 11:02:49.305710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.727 [2024-11-17 11:02:49.305726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.727 #12 NEW cov: 12442 ft: 12968 corp: 3/191b lim: 120 exec/s: 0 rss: 72Mb L: 95/95 MS: 1 ChangeBit- 00:08:24.988 [2024-11-17 11:02:49.395762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.395796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.988 [2024-11-17 11:02:49.395830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.395848] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.988 [2024-11-17 11:02:49.395877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.395894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.988 #13 NEW cov: 12448 ft: 13131 corp: 4/286b lim: 120 exec/s: 0 rss: 72Mb L: 95/95 MS: 1 ChangeBit- 00:08:24.988 [2024-11-17 11:02:49.485897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.485926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.988 [2024-11-17 11:02:49.485974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.485992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.988 [2024-11-17 11:02:49.486022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.486039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.988 #14 NEW cov: 12533 ft: 13428 corp: 5/381b lim: 120 exec/s: 0 rss: 72Mb L: 95/95 MS: 1 CopyPart- 00:08:24.988 [2024-11-17 11:02:49.546140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14685055082891561931 len:52172 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.546170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.988 [2024-11-17 11:02:49.546219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14685055086129564619 len:52172 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.546236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.988 [2024-11-17 11:02:49.546271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14685055086129564619 len:52172 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.988 [2024-11-17 11:02:49.546288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.989 [2024-11-17 11:02:49.546316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14685055086129564619 len:52172 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.989 [2024-11-17 11:02:49.546332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.989 #15 NEW cov: 12533 ft: 13855 corp: 6/477b lim: 120 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:24.989 [2024-11-17 11:02:49.606224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.989 [2024-11-17 11:02:49.606253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.989 [2024-11-17 11:02:49.606301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.989 [2024-11-17 11:02:49.606319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.989 [2024-11-17 11:02:49.606349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.989 [2024-11-17 11:02:49.606365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.989 #16 NEW cov: 12533 ft: 13996 corp: 7/572b lim: 120 exec/s: 0 rss: 72Mb L: 95/96 MS: 1 ChangeBit- 00:08:25.249 [2024-11-17 11:02:49.656346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.656376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.656425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.656444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.656474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551590 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.656490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.249 #17 NEW cov: 12533 ft: 14075 corp: 8/667b lim: 120 exec/s: 0 rss: 72Mb L: 95/96 MS: 1 ChangeByte- 00:08:25.249 [2024-11-17 11:02:49.746572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.746601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.746649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65281 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.746666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.746696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.746717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.249 #18 NEW cov: 12533 ft: 14105 corp: 9/762b lim: 120 
exec/s: 0 rss: 72Mb L: 95/96 MS: 1 ChangeBinInt- 00:08:25.249 [2024-11-17 11:02:49.796700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.796729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.796777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.796795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.796824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.796840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.249 #19 NEW cov: 12533 ft: 14218 corp: 10/857b lim: 120 exec/s: 0 rss: 72Mb L: 95/96 MS: 1 ShuffleBytes- 00:08:25.249 [2024-11-17 11:02:49.846833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.846862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.846910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.846927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.249 [2024-11-17 11:02:49.846957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551590 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.249 [2024-11-17 11:02:49.846973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.509 #20 NEW cov: 12533 ft: 14244 corp: 11/952b lim: 120 exec/s: 0 rss: 72Mb L: 95/96 MS: 1 ShuffleBytes- 00:08:25.509 [2024-11-17 11:02:49.937000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.509 [2024-11-17 11:02:49.937029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.509 [2024-11-17 11:02:49.937084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.509 [2024-11-17 11:02:49.937103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.510 #21 NEW cov: 12533 ft: 14604 corp: 12/1003b lim: 120 exec/s: 21 rss: 72Mb L: 51/96 MS: 1 InsertRepeatedBytes- 00:08:25.510 [2024-11-17 11:02:49.997220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:49.997249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:49.997297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:49.997315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:49.997349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551590 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:49.997366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.510 #22 NEW cov: 12533 ft: 14639 corp: 13/1098b lim: 120 exec/s: 22 rss: 73Mb L: 95/96 MS: 1 ChangeByte- 00:08:25.510 [2024-11-17 11:02:50.087569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.087601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:50.087633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.087651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:50.087680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551590 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.087697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:50.087725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.087741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.510 #23 NEW cov: 12533 ft: 14717 corp: 14/1197b lim: 120 exec/s: 23 rss: 73Mb L: 99/99 MS: 1 CMP- DE: "\377\377\001\000"- 00:08:25.510 [2024-11-17 11:02:50.147626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.147655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:50.147703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.147721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.510 [2024-11-17 11:02:50.147751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
sqid:1 cid:2 nsid:0 lba:18446744073709551590 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.510 [2024-11-17 11:02:50.147767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.770 #24 NEW cov: 12533 ft: 14765 corp: 15/1292b lim: 120 exec/s: 24 rss: 73Mb L: 95/99 MS: 1 CrossOver- 00:08:25.770 [2024-11-17 11:02:50.237909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.237939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.770 [2024-11-17 11:02:50.237987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709486336 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.238004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.770 [2024-11-17 11:02:50.238034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073701163007 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.238057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.770 [2024-11-17 11:02:50.238090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.238107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.770 #25 NEW cov: 12533 ft: 14807 corp: 16/1391b lim: 120 exec/s: 25 rss: 73Mb L: 99/99 MS: 1 PersAutoDict- DE: "\377\377\001\000"- 00:08:25.770 [2024-11-17 11:02:50.298116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.298146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.770 [2024-11-17 11:02:50.298178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65472 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.298196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.770 [2024-11-17 11:02:50.298226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.770 [2024-11-17 11:02:50.298242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.770 [2024-11-17 11:02:50.298270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4785070309113856 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.771 [2024-11-17 11:02:50.298286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.771 #26 NEW cov: 12533 ft: 14829 
corp: 17/1494b lim: 120 exec/s: 26 rss: 73Mb L: 103/103 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\020"- 00:08:25.771 [2024-11-17 11:02:50.388252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.771 [2024-11-17 11:02:50.388283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.771 [2024-11-17 11:02:50.388330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.771 [2024-11-17 11:02:50.388348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.771 [2024-11-17 11:02:50.388379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.771 [2024-11-17 11:02:50.388396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.031 #27 NEW cov: 12533 ft: 14858 corp: 18/1589b lim: 120 exec/s: 27 rss: 73Mb L: 95/103 MS: 1 ChangeBit- 00:08:26.031 [2024-11-17 11:02:50.478541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18381723354089390079 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.031 [2024-11-17 11:02:50.478572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.031 [2024-11-17 11:02:50.478606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.031 [2024-11-17 11:02:50.478625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.031 [2024-11-17 11:02:50.478655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.031 [2024-11-17 11:02:50.478676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.031 #28 NEW cov: 12533 ft: 14867 corp: 19/1684b lim: 120 exec/s: 28 rss: 73Mb L: 95/103 MS: 1 ChangeByte- 00:08:26.031 [2024-11-17 11:02:50.568714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.031 [2024-11-17 11:02:50.568744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.031 [2024-11-17 11:02:50.568792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.031 [2024-11-17 11:02:50.568810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.031 [2024-11-17 11:02:50.568840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.032 [2024-11-17 
11:02:50.568858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.032 #29 NEW cov: 12533 ft: 14880 corp: 20/1756b lim: 120 exec/s: 29 rss: 73Mb L: 72/103 MS: 1 EraseBytes- 00:08:26.032 [2024-11-17 11:02:50.659086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.032 [2024-11-17 11:02:50.659117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.032 [2024-11-17 11:02:50.659152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.032 [2024-11-17 11:02:50.659170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.032 [2024-11-17 11:02:50.659201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446743240485896191 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.032 [2024-11-17 11:02:50.659219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.032 [2024-11-17 11:02:50.659248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.032 [2024-11-17 11:02:50.659265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.292 #30 NEW cov: 12533 ft: 14905 corp: 21/1852b lim: 120 exec/s: 30 rss: 73Mb L: 96/103 MS: 1 InsertByte- 00:08:26.292 [2024-11-17 11:02:50.749206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.749236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.292 [2024-11-17 11:02:50.749284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.749302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.292 [2024-11-17 11:02:50.749332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.749348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.292 #31 NEW cov: 12533 ft: 14945 corp: 22/1947b lim: 120 exec/s: 31 rss: 73Mb L: 95/103 MS: 1 ChangeByte- 00:08:26.292 [2024-11-17 11:02:50.809481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.809518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.292 [2024-11-17 11:02:50.809552] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.809570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.292 [2024-11-17 11:02:50.809600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446743240485896191 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.809617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.292 [2024-11-17 11:02:50.809645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.809662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.292 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:26.292 #32 NEW cov: 12556 ft: 14978 corp: 23/2045b lim: 120 exec/s: 32 rss: 73Mb L: 98/103 MS: 1 CopyPart- 00:08:26.292 [2024-11-17 11:02:50.899531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.899560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.292 [2024-11-17 11:02:50.899608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.292 [2024-11-17 11:02:50.899626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.553 #33 NEW cov: 12556 ft: 15002 corp: 24/2110b lim: 120 exec/s: 16 rss: 73Mb L: 65/103 MS: 1 EraseBytes- 00:08:26.553 #33 DONE cov: 12556 ft: 15002 corp: 24/2110b lim: 120 exec/s: 16 rss: 73Mb 00:08:26.553 ###### Recommended dictionary. ###### 00:08:26.553 "\377\377\001\000" # Uses: 1 00:08:26.553 "\001\000\000\000\000\000\000\020" # Uses: 0 00:08:26.553 ###### End of recommended dictionary. 
###### 00:08:26.553 Done 33 runs in 2 second(s) 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:26.553 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:26.554 11:02:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:26.554 [2024-11-17 11:02:51.116999] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:26.554 [2024-11-17 11:02:51.117101] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155980 ] 00:08:26.814 [2024-11-17 11:02:51.325849] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.814 [2024-11-17 11:02:51.338766] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.814 [2024-11-17 11:02:51.390972] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.814 [2024-11-17 11:02:51.407318] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:26.814 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.814 INFO: Seed: 104655547 00:08:26.814 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:26.814 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:26.814 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:26.814 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.814 #2 INITED exec/s: 0 rss: 64Mb 00:08:26.814 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.814 This may also happen if the target rejected all inputs we tried so far 00:08:26.814 [2024-11-17 11:02:51.451990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.814 [2024-11-17 11:02:51.452022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.335 NEW_FUNC[1/714]: 0x46ff38 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:27.335 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.335 #10 NEW cov: 12268 ft: 12256 corp: 2/31b lim: 100 exec/s: 0 rss: 71Mb L: 30/30 MS: 3 CrossOver-InsertRepeatedBytes-CopyPart- 00:08:27.335 [2024-11-17 11:02:51.802936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.335 [2024-11-17 11:02:51.802973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.335 [2024-11-17 11:02:51.803021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.335 [2024-11-17 11:02:51.803039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.335 [2024-11-17 11:02:51.803074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.335 [2024-11-17 11:02:51.803089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.335 NEW_FUNC[1/1]: 0x151e6a8 in nvmf_tcp_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3555 00:08:27.335 #14 NEW cov: 12385 ft: 13212 corp: 3/104b lim: 100 exec/s: 0 rss: 71Mb L: 73/73 MS: 4 ChangeBinInt-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:27.335 [2024-11-17 11:02:51.872911] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.335 [2024-11-17 11:02:51.872939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.335 #20 NEW cov: 12391 ft: 13484 corp: 4/142b lim: 100 exec/s: 0 rss: 71Mb L: 38/73 MS: 1 EraseBytes- 00:08:27.335 [2024-11-17 11:02:51.963142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.335 [2024-11-17 11:02:51.963171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.603 #26 NEW cov: 12476 ft: 13753 corp: 5/172b lim: 100 exec/s: 0 rss: 71Mb L: 30/73 MS: 1 ChangeByte- 00:08:27.603 [2024-11-17 11:02:52.053383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.603 [2024-11-17 11:02:52.053412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.603 #27 NEW cov: 12476 ft: 13822 corp: 6/210b lim: 100 exec/s: 0 rss: 71Mb L: 38/73 MS: 1 ChangeBinInt- 00:08:27.603 [2024-11-17 11:02:52.143611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.603 [2024-11-17 11:02:52.143640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.603 #28 NEW cov: 12476 ft: 13916 corp: 7/249b lim: 100 exec/s: 0 rss: 72Mb L: 39/73 MS: 1 InsertByte- 00:08:27.604 [2024-11-17 11:02:52.233839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.604 [2024-11-17 11:02:52.233867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.865 #29 NEW cov: 12476 ft: 13996 corp: 8/288b lim: 100 exec/s: 0 rss: 72Mb L: 39/73 MS: 1 ChangeBinInt- 00:08:27.865 [2024-11-17 11:02:52.324163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.865 [2024-11-17 11:02:52.324192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.865 [2024-11-17 11:02:52.324238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.865 [2024-11-17 11:02:52.324254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.865 [2024-11-17 11:02:52.324283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.865 [2024-11-17 11:02:52.324297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.865 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:27.865 #30 NEW cov: 12499 ft: 14081 corp: 9/361b lim: 100 exec/s: 0 rss: 72Mb L: 73/73 MS: 1 ChangeByte- 00:08:27.865 [2024-11-17 11:02:52.384306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.865 [2024-11-17 11:02:52.384334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:27.865 [2024-11-17 11:02:52.384364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.865 [2024-11-17 11:02:52.384395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.865 [2024-11-17 11:02:52.384424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.865 [2024-11-17 11:02:52.384439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.865 #31 NEW cov: 12499 ft: 14150 corp: 10/434b lim: 100 exec/s: 31 rss: 72Mb L: 73/73 MS: 1 CopyPart- 00:08:27.865 [2024-11-17 11:02:52.474501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.865 [2024-11-17 11:02:52.474534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.865 [2024-11-17 11:02:52.474581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.865 [2024-11-17 11:02:52.474597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.865 #32 NEW cov: 12499 ft: 14508 corp: 11/475b lim: 100 exec/s: 32 rss: 72Mb L: 41/73 MS: 1 CMP- DE: "\000\004"- 00:08:28.126 [2024-11-17 11:02:52.534610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.126 [2024-11-17 11:02:52.534638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.126 #33 NEW cov: 12499 ft: 14525 corp: 12/513b lim: 100 exec/s: 33 rss: 72Mb L: 38/73 MS: 1 ChangeBinInt- 00:08:28.126 [2024-11-17 11:02:52.584739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.126 [2024-11-17 11:02:52.584766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.126 #34 NEW cov: 12499 ft: 14555 corp: 13/547b lim: 100 exec/s: 34 rss: 72Mb L: 34/73 MS: 1 EraseBytes- 00:08:28.126 [2024-11-17 11:02:52.634915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.126 [2024-11-17 11:02:52.634942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.126 [2024-11-17 11:02:52.634989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.126 [2024-11-17 11:02:52.635005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.126 #35 NEW cov: 12499 ft: 14564 corp: 14/588b lim: 100 exec/s: 35 rss: 72Mb L: 41/73 MS: 1 CMP- DE: "\007\000\000\000\000\000\000\000"- 00:08:28.126 [2024-11-17 11:02:52.725169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.126 [2024-11-17 11:02:52.725196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.126 [2024-11-17 11:02:52.725242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.126 [2024-11-17 11:02:52.725258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.126 #36 NEW cov: 12499 ft: 14586 corp: 15/632b lim: 100 exec/s: 36 rss: 72Mb L: 44/73 MS: 1 InsertRepeatedBytes- 00:08:28.126 [2024-11-17 11:02:52.775253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.126 [2024-11-17 11:02:52.775296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.387 #37 NEW cov: 12499 ft: 14637 corp: 16/671b lim: 100 exec/s: 37 rss: 72Mb L: 39/73 MS: 1 CrossOver- 00:08:28.387 [2024-11-17 11:02:52.825392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.387 [2024-11-17 11:02:52.825419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.387 #38 NEW cov: 12499 ft: 14703 corp: 17/710b lim: 100 exec/s: 38 rss: 72Mb L: 39/73 MS: 1 CopyPart- 00:08:28.387 [2024-11-17 11:02:52.875575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.387 [2024-11-17 11:02:52.875605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.387 #39 NEW cov: 12499 ft: 14715 corp: 18/748b lim: 100 exec/s: 39 rss: 72Mb L: 38/73 MS: 1 CMP- DE: "!\240X\377\266\254\212\000"- 00:08:28.387 [2024-11-17 11:02:52.965753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.387 [2024-11-17 11:02:52.965785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.387 #40 NEW cov: 12499 ft: 14737 corp: 19/786b lim: 100 exec/s: 40 rss: 72Mb L: 38/73 MS: 1 ShuffleBytes- 00:08:28.387 [2024-11-17 11:02:53.015883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.387 [2024-11-17 11:02:53.015911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.647 #41 NEW cov: 12499 ft: 14786 corp: 20/819b lim: 100 exec/s: 41 rss: 72Mb L: 33/73 MS: 1 EraseBytes- 00:08:28.647 [2024-11-17 11:02:53.066128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.647 [2024-11-17 11:02:53.066155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.647 [2024-11-17 11:02:53.066200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.647 [2024-11-17 11:02:53.066216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.647 [2024-11-17 11:02:53.066245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:28.647 [2024-11-17 11:02:53.066260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.647 #42 NEW cov: 12499 ft: 14800 corp: 21/894b lim: 100 exec/s: 42 rss: 72Mb L: 75/75 MS: 1 PersAutoDict- DE: 
"\000\004"- 00:08:28.647 [2024-11-17 11:02:53.156288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.647 [2024-11-17 11:02:53.156315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.647 [2024-11-17 11:02:53.156362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.647 [2024-11-17 11:02:53.156378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.647 #43 NEW cov: 12499 ft: 14820 corp: 22/946b lim: 100 exec/s: 43 rss: 72Mb L: 52/75 MS: 1 CrossOver- 00:08:28.647 [2024-11-17 11:02:53.246566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.647 [2024-11-17 11:02:53.246594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.647 [2024-11-17 11:02:53.246639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.647 [2024-11-17 11:02:53.246655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.647 [2024-11-17 11:02:53.246684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:28.647 [2024-11-17 11:02:53.246698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.908 #44 NEW cov: 12499 ft: 14834 corp: 23/1019b lim: 100 exec/s: 44 rss: 72Mb L: 73/75 MS: 1 EraseBytes- 00:08:28.908 [2024-11-17 11:02:53.336748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.908 [2024-11-17 11:02:53.336775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.908 [2024-11-17 11:02:53.336822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.908 [2024-11-17 11:02:53.336838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.908 #45 NEW cov: 12499 ft: 14873 corp: 24/1078b lim: 100 exec/s: 45 rss: 72Mb L: 59/75 MS: 1 CopyPart- 00:08:28.908 [2024-11-17 11:02:53.396863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.908 [2024-11-17 11:02:53.396894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.908 #46 NEW cov: 12499 ft: 14879 corp: 25/1112b lim: 100 exec/s: 46 rss: 72Mb L: 34/75 MS: 1 CrossOver- 00:08:28.908 [2024-11-17 11:02:53.447039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.908 [2024-11-17 11:02:53.447073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.908 [2024-11-17 11:02:53.447120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.908 [2024-11-17 11:02:53.447135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:28.908 #47 NEW cov: 12499 ft: 14891 corp: 26/1152b lim: 100 exec/s: 23 rss: 73Mb L: 40/75 MS: 1 InsertByte-
00:08:28.908 #47 DONE cov: 12499 ft: 14891 corp: 26/1152b lim: 100 exec/s: 23 rss: 73Mb
00:08:28.908 ###### Recommended dictionary. ######
00:08:28.908 "\000\004" # Uses: 1
00:08:28.908 "\007\000\000\000\000\000\000\000" # Uses: 0
00:08:28.908 "!\240X\377\266\254\212\000" # Uses: 0
00:08:28.908 ###### End of recommended dictionary. ######
00:08:28.908 Done 47 runs in 2 second(s)
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:29.169 11:02:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19
00:08:29.169 [2024-11-17 11:02:53.661307] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization...
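The xtrace lines above show nvmf/run.sh tearing down fuzzer 18 and bringing up fuzzer 19: the fuzzer index is zero-padded into a unique NVMe/TCP listen port, a per-run JSON config and corpus directory are derived from it, LeakSanitizer suppressions are written, and the harness binary is launched against the resulting transport ID. A condensed bash sketch of those traced steps follows; it is a reconstruction for illustration, not the actual run.sh ($SPDK stands in for the checkout path shown in the log, and the output redirections, which xtrace does not print, are assumed):

    # Sketch of the per-fuzzer setup traced above (hypothetical reconstruction).
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$SPDK/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz
        local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
        # unique listen port per fuzzer: "44" + zero-padded index, e.g. 19 -> 4419
        local port=44$(printf %02d "$fuzzer_type")
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # rewrite the shared JSON template so this run's target listens on $port
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$SPDK"/test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"
        # suppress two known-benign allocations so LSan does not fail the run
        echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
        echo leak:nvmf_ctrlr_create >> "$suppress_file"
        "$SPDK"/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m "$core" -s 512 \
            -P "$SPDK"/../output/llvm/ -F "$trid" -c "$nvmf_cfg" \
            -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
    }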
00:08:29.169 [2024-11-17 11:02:53.661377] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156318 ]
00:08:29.430 [2024-11-17 11:02:53.858374] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:29.430 [2024-11-17 11:02:53.871814] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:29.430 [2024-11-17 11:02:53.924475] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:29.430 [2024-11-17 11:02:53.940808] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
00:08:29.430 INFO: Running with entropic power schedule (0xFF, 100).
00:08:29.430 INFO: Seed: 2638653975
00:08:29.430 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220),
00:08:29.430 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360),
00:08:29.430 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:29.430 INFO: A corpus is not provided, starting from an empty corpus
00:08:29.430 #2 INITED exec/s: 0 rss: 65Mb
00:08:29.430 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:29.430 This may also happen if the target rejected all inputs we tried so far
00:08:29.430 [2024-11-17 11:02:54.007052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594
00:08:29.430 [2024-11-17 11:02:54.007092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.430 [2024-11-17 11:02:54.007199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594
00:08:29.430 [2024-11-17 11:02:54.007223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.430 [2024-11-17 11:02:54.007344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594
00:08:29.430 [2024-11-17 11:02:54.007366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:29.691 NEW_FUNC[1/715]: 0x472ef8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582
00:08:29.691 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:29.691 #20 NEW cov: 12250 ft: 12251 corp: 2/39b lim: 50 exec/s: 0 rss: 72Mb L: 38/38 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes-
00:08:29.691 [2024-11-17 11:02:54.337936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594
00:08:29.691 [2024-11-17 11:02:54.337982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.691 [2024-11-17 11:02:54.338109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594
00:08:29.691 [2024-11-17 11:02:54.338135]
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.691 [2024-11-17 11:02:54.338253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:29.691 [2024-11-17 11:02:54.338276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.952 #31 NEW cov: 12363 ft: 12857 corp: 3/77b lim: 50 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 ChangeByte- 00:08:29.952 [2024-11-17 11:02:54.408089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668468413083953 len:12594 00:08:29.952 [2024-11-17 11:02:54.408123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.408215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.408239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.408353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.408382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.952 #35 NEW cov: 12369 ft: 13075 corp: 4/116b lim: 50 exec/s: 0 rss: 72Mb L: 39/39 MS: 4 CrossOver-CopyPart-ChangeBit-CrossOver- 00:08:29.952 [2024-11-17 11:02:54.458030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:29.952 [2024-11-17 11:02:54.458066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.458178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.458205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.952 #36 NEW cov: 12454 ft: 13633 corp: 5/142b lim: 50 exec/s: 0 rss: 72Mb L: 26/39 MS: 1 EraseBytes- 00:08:29.952 [2024-11-17 11:02:54.528356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:29.952 [2024-11-17 11:02:54.528392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.528499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.528520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.528637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.528660] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.952 #37 NEW cov: 12454 ft: 13726 corp: 6/180b lim: 50 exec/s: 0 rss: 72Mb L: 38/39 MS: 1 ChangeBinInt- 00:08:29.952 [2024-11-17 11:02:54.578779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:29.952 [2024-11-17 11:02:54.578811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.578887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.578914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.579031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:822083584 len:1 00:08:29.952 [2024-11-17 11:02:54.579059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.952 [2024-11-17 11:02:54.579185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3544668469065756977 len:12594 00:08:29.952 [2024-11-17 11:02:54.579211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.952 #38 NEW cov: 12454 ft: 14024 corp: 7/227b lim: 50 exec/s: 0 rss: 72Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:30.213 [2024-11-17 11:02:54.628753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.213 [2024-11-17 11:02:54.628785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.213 [2024-11-17 11:02:54.628886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.213 [2024-11-17 11:02:54.628911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.629021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.214 [2024-11-17 11:02:54.629038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.214 #39 NEW cov: 12454 ft: 14128 corp: 8/258b lim: 50 exec/s: 0 rss: 72Mb L: 31/47 MS: 1 EraseBytes- 00:08:30.214 [2024-11-17 11:02:54.699396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.214 [2024-11-17 11:02:54.699430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.699510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.214 [2024-11-17 11:02:54.699532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.699638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:822083584 len:1 00:08:30.214 [2024-11-17 11:02:54.699661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.699783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3544668469065756977 len:12594 00:08:30.214 [2024-11-17 11:02:54.699806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.699929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3544668469065756977 len:12555 00:08:30.214 [2024-11-17 11:02:54.699954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.214 #40 NEW cov: 12454 ft: 14223 corp: 9/308b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 CrossOver- 00:08:30.214 [2024-11-17 11:02:54.769145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.214 [2024-11-17 11:02:54.769179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.769270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.214 [2024-11-17 11:02:54.769295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.769411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12593 00:08:30.214 [2024-11-17 11:02:54.769434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.214 #41 NEW cov: 12454 ft: 14256 corp: 10/339b lim: 50 exec/s: 0 rss: 72Mb L: 31/50 MS: 1 ChangeASCIIInt- 00:08:30.214 [2024-11-17 11:02:54.839387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.214 [2024-11-17 11:02:54.839419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.839522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.214 [2024-11-17 11:02:54.839547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.214 [2024-11-17 11:02:54.839656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.214 [2024-11-17 11:02:54.839682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.214 #42 NEW cov: 12454 ft: 14396 corp: 11/370b lim: 50 exec/s: 0 rss: 72Mb L: 31/50 MS: 1 ChangeBit- 00:08:30.475 [2024-11-17 11:02:54.889825] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 00:08:30.475 [2024-11-17 11:02:54.889857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.475 [2024-11-17 11:02:54.889956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 00:08:30.475 [2024-11-17 11:02:54.889980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.475 [2024-11-17 11:02:54.890099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 00:08:30.475 [2024-11-17 11:02:54.890125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.475 [2024-11-17 11:02:54.890239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 00:08:30.475 [2024-11-17 11:02:54.890264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.475 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:30.475 #43 NEW cov: 12477 ft: 14463 corp: 12/413b lim: 50 exec/s: 0 rss: 73Mb L: 43/50 MS: 1 InsertRepeatedBytes- 00:08:30.475 [2024-11-17 11:02:54.940094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.475 [2024-11-17 11:02:54.940124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.475 [2024-11-17 11:02:54.940227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.476 [2024-11-17 11:02:54.940249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:54.940367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:822083584 len:1 00:08:30.476 [2024-11-17 11:02:54.940392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:54.940502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3544668469065756977 len:12594 00:08:30.476 [2024-11-17 11:02:54.940525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:54.940646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3544668469065757233 len:12555 00:08:30.476 [2024-11-17 11:02:54.940667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.476 #44 NEW cov: 12477 ft: 14511 corp: 13/463b lim: 50 exec/s: 0 rss: 73Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:30.476 [2024-11-17 11:02:55.009902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.476 [2024-11-17 11:02:55.009936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:55.010044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.476 [2024-11-17 11:02:55.010072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:55.010188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.476 [2024-11-17 11:02:55.010214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.476 #45 NEW cov: 12477 ft: 14535 corp: 14/500b lim: 50 exec/s: 45 rss: 73Mb L: 37/50 MS: 1 CopyPart- 00:08:30.476 [2024-11-17 11:02:55.080122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12588 00:08:30.476 [2024-11-17 11:02:55.080156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:55.080275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.476 [2024-11-17 11:02:55.080298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.476 [2024-11-17 11:02:55.080411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.476 [2024-11-17 11:02:55.080435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.476 #46 NEW cov: 12477 ft: 14588 corp: 15/537b lim: 50 exec/s: 46 rss: 73Mb L: 37/50 MS: 1 ChangeBinInt- 00:08:30.737 [2024-11-17 11:02:55.150344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.737 [2024-11-17 11:02:55.150378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.150483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.150507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.150633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544895853224341809 len:65288 00:08:30.737 [2024-11-17 11:02:55.150656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.737 #47 NEW cov: 12477 ft: 14610 corp: 16/572b lim: 50 exec/s: 47 rss: 73Mb L: 35/50 MS: 1 CMP- DE: "\377\377\377\007"- 00:08:30.737 [2024-11-17 11:02:55.200577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:3544668469485187377 len:12588 00:08:30.737 [2024-11-17 11:02:55.200612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.200715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.200739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.200858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.200877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.737 #48 NEW cov: 12477 ft: 14656 corp: 17/609b lim: 50 exec/s: 48 rss: 73Mb L: 37/50 MS: 1 CopyPart- 00:08:30.737 [2024-11-17 11:02:55.270731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668468413083953 len:12594 00:08:30.737 [2024-11-17 11:02:55.270768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.270886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.270910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.271023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668471213240625 len:12594 00:08:30.737 [2024-11-17 11:02:55.271053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.737 #49 NEW cov: 12477 ft: 14678 corp: 18/648b lim: 50 exec/s: 49 rss: 73Mb L: 39/50 MS: 1 ChangeBit- 00:08:30.737 [2024-11-17 11:02:55.340829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.737 [2024-11-17 11:02:55.340863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.340952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.340975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.341098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.341120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.737 #50 NEW cov: 12477 ft: 14707 corp: 19/686b lim: 50 exec/s: 50 rss: 73Mb L: 38/50 MS: 1 ChangeASCIIInt- 00:08:30.737 [2024-11-17 11:02:55.391066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12588 00:08:30.737 [2024-11-17 
11:02:55.391102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.391218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.737 [2024-11-17 11:02:55.391245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.737 [2024-11-17 11:02:55.391358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:446191925434855729 len:12594 00:08:30.737 [2024-11-17 11:02:55.391380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.997 #51 NEW cov: 12477 ft: 14734 corp: 20/724b lim: 50 exec/s: 51 rss: 73Mb L: 38/50 MS: 1 InsertByte- 00:08:30.997 [2024-11-17 11:02:55.461310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.997 [2024-11-17 11:02:55.461346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.461442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:9778 00:08:30.997 [2024-11-17 11:02:55.461468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.461587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.997 [2024-11-17 11:02:55.461608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.997 #52 NEW cov: 12477 ft: 14773 corp: 21/763b lim: 50 exec/s: 52 rss: 73Mb L: 39/50 MS: 1 InsertByte- 00:08:30.997 [2024-11-17 11:02:55.531718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.997 [2024-11-17 11:02:55.531753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.531865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.997 [2024-11-17 11:02:55.531889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.531998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:504403159104356351 len:1 00:08:30.997 [2024-11-17 11:02:55.532020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.532136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3544668469065756977 len:12594 00:08:30.997 [2024-11-17 11:02:55.532155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.997 #53 NEW cov: 12477 ft: 
14793 corp: 22/810b lim: 50 exec/s: 53 rss: 73Mb L: 47/50 MS: 1 PersAutoDict- DE: "\377\377\377\007"- 00:08:30.997 [2024-11-17 11:02:55.581714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:30.997 [2024-11-17 11:02:55.581749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.581858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:30.997 [2024-11-17 11:02:55.581877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.581985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:30.997 [2024-11-17 11:02:55.582011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.997 #54 NEW cov: 12477 ft: 14845 corp: 23/848b lim: 50 exec/s: 54 rss: 73Mb L: 38/50 MS: 1 CopyPart- 00:08:30.997 [2024-11-17 11:02:55.631861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668468413083953 len:12594 00:08:30.997 [2024-11-17 11:02:55.631892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.631981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668472535416583 len:12594 00:08:30.997 [2024-11-17 11:02:55.632008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.997 [2024-11-17 11:02:55.632124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668471213240625 len:12594 00:08:30.997 [2024-11-17 11:02:55.632148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.257 #55 NEW cov: 12477 ft: 14852 corp: 24/887b lim: 50 exec/s: 55 rss: 74Mb L: 39/50 MS: 1 PersAutoDict- DE: "\377\377\377\007"- 00:08:31.257 [2024-11-17 11:02:55.701965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:31.257 [2024-11-17 11:02:55.701997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.257 [2024-11-17 11:02:55.702107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:9778 00:08:31.257 [2024-11-17 11:02:55.702127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.257 [2024-11-17 11:02:55.702246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:31.257 [2024-11-17 11:02:55.702265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.257 #56 NEW cov: 12477 ft: 14872 corp: 25/926b 
lim: 50 exec/s: 56 rss: 74Mb L: 39/50 MS: 1 ChangeASCIIInt- 00:08:31.258 [2024-11-17 11:02:55.772559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:31.258 [2024-11-17 11:02:55.772594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.772709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:31.258 [2024-11-17 11:02:55.772726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.772836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:805306368 len:1 00:08:31.258 [2024-11-17 11:02:55.772860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.772980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3544668469065756977 len:12594 00:08:31.258 [2024-11-17 11:02:55.773005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.773113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3544668469065756977 len:12555 00:08:31.258 [2024-11-17 11:02:55.773136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:31.258 #57 NEW cov: 12477 ft: 14954 corp: 26/976b lim: 50 exec/s: 57 rss: 74Mb L: 50/50 MS: 1 ChangeASCIIInt- 00:08:31.258 [2024-11-17 11:02:55.822325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668468413083953 len:12594 00:08:31.258 [2024-11-17 11:02:55.822360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.822470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:31.258 [2024-11-17 11:02:55.822493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.822610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:31.258 [2024-11-17 11:02:55.822635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.258 #58 NEW cov: 12477 ft: 14978 corp: 27/1008b lim: 50 exec/s: 58 rss: 74Mb L: 32/50 MS: 1 CrossOver- 00:08:31.258 [2024-11-17 11:02:55.872501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:31.258 [2024-11-17 11:02:55.872536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.872650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:31.258 [2024-11-17 11:02:55.872672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.258 [2024-11-17 11:02:55.872787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:31.258 [2024-11-17 11:02:55.872811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.258 #59 NEW cov: 12477 ft: 15044 corp: 28/1045b lim: 50 exec/s: 59 rss: 74Mb L: 37/50 MS: 1 ShuffleBytes- 00:08:31.518 [2024-11-17 11:02:55.922809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12588 00:08:31.518 [2024-11-17 11:02:55.922843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.518 [2024-11-17 11:02:55.922934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065756977 len:12594 00:08:31.518 [2024-11-17 11:02:55.922957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.518 [2024-11-17 11:02:55.923077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6872316418842505567 len:12594 00:08:31.518 [2024-11-17 11:02:55.923102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.518 [2024-11-17 11:02:55.923215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3544668469065756977 len:12594 00:08:31.518 [2024-11-17 11:02:55.923236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.518 #60 NEW cov: 12477 ft: 15069 corp: 29/1087b lim: 50 exec/s: 60 rss: 74Mb L: 42/50 MS: 1 InsertRepeatedBytes- 00:08:31.518 [2024-11-17 11:02:55.972777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3544668469485187377 len:12594 00:08:31.518 [2024-11-17 11:02:55.972808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.518 [2024-11-17 11:02:55.972917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3544668469065795377 len:12594 00:08:31.518 [2024-11-17 11:02:55.972940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.518 [2024-11-17 11:02:55.973054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3544668469065756977 len:12594 00:08:31.518 [2024-11-17 11:02:55.973074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.518 #61 NEW cov: 12477 ft: 15075 corp: 30/1125b lim: 50 exec/s: 30 rss: 74Mb L: 38/50 MS: 1 ChangeBinInt- 00:08:31.518 #61 DONE cov: 12477 ft: 15075 corp: 30/1125b lim: 50 exec/s: 30 rss: 74Mb 00:08:31.518 ###### Recommended dictionary. 
######
00:08:31.518 "\377\377\377\007" # Uses: 2
00:08:31.518 ###### End of recommended dictionary. ######
00:08:31.518 Done 61 runs in 2 second(s)
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420'
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:31.518 11:02:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20
00:08:31.519 [2024-11-17 11:02:56.137828] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization...
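As with run 18 above, the closing report for run 19 names the dictionary entries that kept producing new coverage (the "# Uses" counts). libFuzzer prints these with octal escapes, while the AFL-style dictionary files its -dict= option reads use \xNN hex escapes, so reusing them in a later run means converting the bytes. A hypothetical dictionary file built from the entries reported by runs 18 and 19 (the file name and entry names are made up, and it assumes this harness forwards the standard -dict= libFuzzer flag, which the trace does not show):

    # llvm_nvmf.dict -- recommended entries from runs 18 and 19, octal -> hex
    write_zeroes_1="\x00\x04"
    write_zeroes_2="\x07\x00\x00\x00\x00\x00\x00\x00"
    write_zeroes_3="!\xa0X\xff\xb6\xac\x8a\x00"
    write_uncor_1="\xff\xff\xff\x07"

A follow-up run would then append -dict=llvm_nvmf.dict to the llvm_nvme_fuzz invocation to seed mutations with these byte patterns.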
00:08:31.519 [2024-11-17 11:02:56.137891] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156800 ]
00:08:31.779 [2024-11-17 11:02:56.329416] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:31.779 [2024-11-17 11:02:56.343142] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:31.779 [2024-11-17 11:02:56.395729] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:31.779 [2024-11-17 11:02:56.412064] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 ***
00:08:31.779 INFO: Running with entropic power schedule (0xFF, 100).
00:08:31.779 INFO: Seed: 814693382
00:08:32.040 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220),
00:08:32.040 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360),
00:08:32.040 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:32.040 INFO: A corpus is not provided, starting from an empty corpus
00:08:32.040 #2 INITED exec/s: 0 rss: 65Mb
00:08:32.040 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:32.040 This may also happen if the target rejected all inputs we tried so far
00:08:32.040 [2024-11-17 11:02:56.467656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:32.040 [2024-11-17 11:02:56.467693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.040 [2024-11-17 11:02:56.467760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:32.040 [2024-11-17 11:02:56.467780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.040 [2024-11-17 11:02:56.467841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:32.040 [2024-11-17 11:02:56.467860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.300 NEW_FUNC[1/717]: 0x474ab8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597
00:08:32.300 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:32.300 #8 NEW cov: 12308 ft: 12307 corp: 2/60b lim: 90 exec/s: 0 rss: 71Mb L: 59/59 MS: 1 InsertRepeatedBytes-
00:08:32.300 [2024-11-17 11:02:56.798618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:32.300 [2024-11-17 11:02:56.798685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.300 [2024-11-17 11:02:56.798770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:32.300 [2024-11-17
11:02:56.798884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.300 [2024-11-17 11:02:56.798913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.300 #14 NEW cov: 12421 ft: 12936 corp: 3/120b lim: 90 exec/s: 0 rss: 72Mb L: 60/60 MS: 1 InsertByte- 00:08:32.300 [2024-11-17 11:02:56.868544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.300 [2024-11-17 11:02:56.868572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.300 [2024-11-17 11:02:56.868626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.300 [2024-11-17 11:02:56.868642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.300 [2024-11-17 11:02:56.868699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.300 [2024-11-17 11:02:56.868713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.300 #15 NEW cov: 12427 ft: 13232 corp: 4/180b lim: 90 exec/s: 0 rss: 72Mb L: 60/60 MS: 1 ChangeASCIIInt- 00:08:32.300 [2024-11-17 11:02:56.928665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.300 [2024-11-17 11:02:56.928693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.300 [2024-11-17 11:02:56.928733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.301 [2024-11-17 11:02:56.928750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.301 [2024-11-17 11:02:56.928803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.301 [2024-11-17 11:02:56.928819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.561 #21 NEW cov: 12512 ft: 13518 corp: 5/243b lim: 90 exec/s: 0 rss: 72Mb L: 63/63 MS: 1 CopyPart- 00:08:32.561 [2024-11-17 11:02:56.988847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-11-17 11:02:56.988873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-11-17 11:02:56.988918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-11-17 11:02:56.988933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 [2024-11-17 11:02:56.988988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.561 [2024-11-17 11:02:56.989004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.561 #22 NEW cov: 12512 ft: 
13628 corp: 6/303b lim: 90 exec/s: 0 rss: 72Mb L: 60/63 MS: 1 ShuffleBytes- 00:08:32.561 [2024-11-17 11:02:57.029080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-11-17 11:02:57.029109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-11-17 11:02:57.029152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-11-17 11:02:57.029166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 [2024-11-17 11:02:57.029219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.562 [2024-11-17 11:02:57.029235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.562 [2024-11-17 11:02:57.029288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.562 [2024-11-17 11:02:57.029304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.562 #26 NEW cov: 12512 ft: 14083 corp: 7/379b lim: 90 exec/s: 0 rss: 72Mb L: 76/76 MS: 4 InsertByte-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:08:32.562 [2024-11-17 11:02:57.068754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.562 [2024-11-17 11:02:57.068782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.562 #27 NEW cov: 12512 ft: 14992 corp: 8/398b lim: 90 exec/s: 0 rss: 72Mb L: 19/76 MS: 1 CrossOver- 00:08:32.562 [2024-11-17 11:02:57.129219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.562 [2024-11-17 11:02:57.129245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.562 [2024-11-17 11:02:57.129287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.562 [2024-11-17 11:02:57.129301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.562 [2024-11-17 11:02:57.129357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.562 [2024-11-17 11:02:57.129372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.562 #28 NEW cov: 12512 ft: 15017 corp: 9/458b lim: 90 exec/s: 0 rss: 72Mb L: 60/76 MS: 1 ShuffleBytes- 00:08:32.562 [2024-11-17 11:02:57.169337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.562 [2024-11-17 11:02:57.169364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.562 [2024-11-17 11:02:57.169403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.562 [2024-11-17 11:02:57.169419] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.562 [2024-11-17 11:02:57.169475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.562 [2024-11-17 11:02:57.169491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.562 #29 NEW cov: 12512 ft: 15108 corp: 10/517b lim: 90 exec/s: 0 rss: 72Mb L: 59/76 MS: 1 ChangeASCIIInt- 00:08:32.562 [2024-11-17 11:02:57.209287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.562 [2024-11-17 11:02:57.209313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.562 [2024-11-17 11:02:57.209350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.562 [2024-11-17 11:02:57.209365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.823 #30 NEW cov: 12512 ft: 15446 corp: 11/569b lim: 90 exec/s: 0 rss: 72Mb L: 52/76 MS: 1 EraseBytes- 00:08:32.823 [2024-11-17 11:02:57.249690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.823 [2024-11-17 11:02:57.249717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.249765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.823 [2024-11-17 11:02:57.249781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.249836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.823 [2024-11-17 11:02:57.249851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.249905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.823 [2024-11-17 11:02:57.249921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.823 #32 NEW cov: 12512 ft: 15457 corp: 12/644b lim: 90 exec/s: 0 rss: 72Mb L: 75/76 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:32.823 [2024-11-17 11:02:57.289660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.823 [2024-11-17 11:02:57.289686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.289724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.823 [2024-11-17 11:02:57.289739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.289795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.823 [2024-11-17 11:02:57.289810] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.823 #33 NEW cov: 12512 ft: 15498 corp: 13/705b lim: 90 exec/s: 0 rss: 72Mb L: 61/76 MS: 1 InsertByte- 00:08:32.823 [2024-11-17 11:02:57.349988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.823 [2024-11-17 11:02:57.350015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.350060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.823 [2024-11-17 11:02:57.350076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.350132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.823 [2024-11-17 11:02:57.350164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.350219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.823 [2024-11-17 11:02:57.350234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.823 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:32.823 #34 NEW cov: 12535 ft: 15568 corp: 14/780b lim: 90 exec/s: 0 rss: 73Mb L: 75/76 MS: 1 ChangeBinInt- 00:08:32.823 [2024-11-17 11:02:57.409993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.823 [2024-11-17 11:02:57.410019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.410067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.823 [2024-11-17 11:02:57.410086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.410143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.823 [2024-11-17 11:02:57.410159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.823 #35 NEW cov: 12535 ft: 15592 corp: 15/849b lim: 90 exec/s: 0 rss: 73Mb L: 69/76 MS: 1 EraseBytes- 00:08:32.823 [2024-11-17 11:02:57.450255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.823 [2024-11-17 11:02:57.450282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.450346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.823 [2024-11-17 11:02:57.450362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.450416] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.823 [2024-11-17 11:02:57.450432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.823 [2024-11-17 11:02:57.450487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.823 [2024-11-17 11:02:57.450503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.823 #36 NEW cov: 12535 ft: 15602 corp: 16/924b lim: 90 exec/s: 36 rss: 73Mb L: 75/76 MS: 1 ChangeBit- 00:08:33.084 [2024-11-17 11:02:57.490112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.084 [2024-11-17 11:02:57.490141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.490224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.084 [2024-11-17 11:02:57.490241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.084 #37 NEW cov: 12535 ft: 15634 corp: 17/976b lim: 90 exec/s: 37 rss: 73Mb L: 52/76 MS: 1 ChangeBit- 00:08:33.084 [2024-11-17 11:02:57.550360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.084 [2024-11-17 11:02:57.550386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.550426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.084 [2024-11-17 11:02:57.550442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.550500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.084 [2024-11-17 11:02:57.550515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.084 #38 NEW cov: 12535 ft: 15643 corp: 18/1037b lim: 90 exec/s: 38 rss: 73Mb L: 61/76 MS: 1 InsertByte- 00:08:33.084 [2024-11-17 11:02:57.590516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.084 [2024-11-17 11:02:57.590542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.590588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.084 [2024-11-17 11:02:57.590604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.590663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.084 [2024-11-17 11:02:57.590679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.084 #39 NEW cov: 12535 ft: 15667 corp: 19/1100b lim: 90 
exec/s: 39 rss: 73Mb L: 63/76 MS: 1 ShuffleBytes- 00:08:33.084 [2024-11-17 11:02:57.650718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.084 [2024-11-17 11:02:57.650745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.650781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.084 [2024-11-17 11:02:57.650797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.650854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.084 [2024-11-17 11:02:57.650869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.084 #40 NEW cov: 12535 ft: 15683 corp: 20/1163b lim: 90 exec/s: 40 rss: 73Mb L: 63/76 MS: 1 ChangeBinInt- 00:08:33.084 [2024-11-17 11:02:57.710859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.084 [2024-11-17 11:02:57.710886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.710932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.084 [2024-11-17 11:02:57.710948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.084 [2024-11-17 11:02:57.711004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.084 [2024-11-17 11:02:57.711019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.345 #41 NEW cov: 12535 ft: 15732 corp: 21/1232b lim: 90 exec/s: 41 rss: 73Mb L: 69/76 MS: 1 CMP- DE: "\377\377\377\377\377\377\003\000"- 00:08:33.345 [2024-11-17 11:02:57.771008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.345 [2024-11-17 11:02:57.771034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.771083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.345 [2024-11-17 11:02:57.771099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.771155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.345 [2024-11-17 11:02:57.771169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.345 #42 NEW cov: 12535 ft: 15745 corp: 22/1291b lim: 90 exec/s: 42 rss: 73Mb L: 59/76 MS: 1 ChangeBit- 00:08:33.345 [2024-11-17 11:02:57.811121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.345 [2024-11-17 11:02:57.811146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.811207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.345 [2024-11-17 11:02:57.811223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.811280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.345 [2024-11-17 11:02:57.811300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.345 #43 NEW cov: 12535 ft: 15762 corp: 23/1359b lim: 90 exec/s: 43 rss: 73Mb L: 68/76 MS: 1 InsertRepeatedBytes- 00:08:33.345 [2024-11-17 11:02:57.871426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.345 [2024-11-17 11:02:57.871452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.871501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.345 [2024-11-17 11:02:57.871517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.871573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.345 [2024-11-17 11:02:57.871587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.871643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.345 [2024-11-17 11:02:57.871657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.345 #44 NEW cov: 12535 ft: 15770 corp: 24/1445b lim: 90 exec/s: 44 rss: 73Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:08:33.345 [2024-11-17 11:02:57.931457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.345 [2024-11-17 11:02:57.931482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.931527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.345 [2024-11-17 11:02:57.931543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.931597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.345 [2024-11-17 11:02:57.931628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.345 #45 NEW cov: 12535 ft: 15778 corp: 25/1516b lim: 90 exec/s: 45 rss: 73Mb L: 71/86 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:08:33.345 [2024-11-17 11:02:57.971572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.345 [2024-11-17 11:02:57.971598] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.971633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.345 [2024-11-17 11:02:57.971649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.345 [2024-11-17 11:02:57.971705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.345 [2024-11-17 11:02:57.971721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.345 #46 NEW cov: 12535 ft: 15786 corp: 26/1575b lim: 90 exec/s: 46 rss: 73Mb L: 59/86 MS: 1 ChangeBit- 00:08:33.606 [2024-11-17 11:02:58.011734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.606 [2024-11-17 11:02:58.011763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.011807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.606 [2024-11-17 11:02:58.011823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.011881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.606 [2024-11-17 11:02:58.011897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.606 #47 NEW cov: 12535 ft: 15797 corp: 27/1636b lim: 90 exec/s: 47 rss: 73Mb L: 61/86 MS: 1 ChangeBit- 00:08:33.606 [2024-11-17 11:02:58.051672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.606 [2024-11-17 11:02:58.051698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.051736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.606 [2024-11-17 11:02:58.051752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.606 #48 NEW cov: 12535 ft: 15863 corp: 28/1688b lim: 90 exec/s: 48 rss: 73Mb L: 52/86 MS: 1 ChangeBit- 00:08:33.606 [2024-11-17 11:02:58.112134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.606 [2024-11-17 11:02:58.112161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.112211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.606 [2024-11-17 11:02:58.112226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.112278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.606 [2024-11-17 11:02:58.112293] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.112348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.606 [2024-11-17 11:02:58.112364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.606 #49 NEW cov: 12535 ft: 15881 corp: 29/1764b lim: 90 exec/s: 49 rss: 73Mb L: 76/86 MS: 1 InsertByte- 00:08:33.606 [2024-11-17 11:02:58.152290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.606 [2024-11-17 11:02:58.152317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.152369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.606 [2024-11-17 11:02:58.152384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.152439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.606 [2024-11-17 11:02:58.152454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.152508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.606 [2024-11-17 11:02:58.152523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.606 #50 NEW cov: 12535 ft: 15886 corp: 30/1853b lim: 90 exec/s: 50 rss: 73Mb L: 89/89 MS: 1 CopyPart- 00:08:33.606 [2024-11-17 11:02:58.192391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.606 [2024-11-17 11:02:58.192419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.192460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.606 [2024-11-17 11:02:58.192479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.192531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.606 [2024-11-17 11:02:58.192563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.192616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.606 [2024-11-17 11:02:58.192630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.606 #51 NEW cov: 12535 ft: 15904 corp: 31/1928b lim: 90 exec/s: 51 rss: 73Mb L: 75/89 MS: 1 CopyPart- 00:08:33.606 [2024-11-17 11:02:58.252452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.606 [2024-11-17 11:02:58.252478] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.252524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.606 [2024-11-17 11:02:58.252539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.606 [2024-11-17 11:02:58.252593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.606 [2024-11-17 11:02:58.252608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.867 #52 NEW cov: 12535 ft: 15909 corp: 32/1997b lim: 90 exec/s: 52 rss: 73Mb L: 69/89 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:08:33.867 [2024-11-17 11:02:58.292502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.867 [2024-11-17 11:02:58.292528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.292563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.867 [2024-11-17 11:02:58.292579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.292633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.867 [2024-11-17 11:02:58.292648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.867 #53 NEW cov: 12535 ft: 15932 corp: 33/2066b lim: 90 exec/s: 53 rss: 73Mb L: 69/89 MS: 1 ChangeBinInt- 00:08:33.867 [2024-11-17 11:02:58.352674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.867 [2024-11-17 11:02:58.352700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.352748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.867 [2024-11-17 11:02:58.352763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.352815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.867 [2024-11-17 11:02:58.352829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.867 #54 NEW cov: 12535 ft: 15988 corp: 34/2126b lim: 90 exec/s: 54 rss: 73Mb L: 60/89 MS: 1 CrossOver- 00:08:33.867 [2024-11-17 11:02:58.392625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.867 [2024-11-17 11:02:58.392656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.392704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 
00:08:33.867 [2024-11-17 11:02:58.392720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.867 #55 NEW cov: 12535 ft: 16007 corp: 35/2173b lim: 90 exec/s: 55 rss: 73Mb L: 47/89 MS: 1 EraseBytes- 00:08:33.867 [2024-11-17 11:02:58.432881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.867 [2024-11-17 11:02:58.432907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.432954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.867 [2024-11-17 11:02:58.432970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.433023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.867 [2024-11-17 11:02:58.433045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.867 #56 NEW cov: 12535 ft: 16055 corp: 36/2233b lim: 90 exec/s: 56 rss: 73Mb L: 60/89 MS: 1 ShuffleBytes- 00:08:33.867 [2024-11-17 11:02:58.473149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.867 [2024-11-17 11:02:58.473177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.473227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.867 [2024-11-17 11:02:58.473243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.473314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.867 [2024-11-17 11:02:58.473331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.867 [2024-11-17 11:02:58.473385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.867 [2024-11-17 11:02:58.473401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.867 #57 NEW cov: 12535 ft: 16083 corp: 37/2308b lim: 90 exec/s: 28 rss: 73Mb L: 75/89 MS: 1 ChangeByte- 00:08:33.867 #57 DONE cov: 12535 ft: 16083 corp: 37/2308b lim: 90 exec/s: 28 rss: 73Mb 00:08:33.867 ###### Recommended dictionary. ###### 00:08:33.867 "\377\377\377\377\377\377\003\000" # Uses: 2 00:08:33.867 ###### End of recommended dictionary. 
###### 00:08:33.867 Done 57 runs in 2 second(s) 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:34.128 11:02:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:34.128 [2024-11-17 11:02:58.637678] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:34.128 [2024-11-17 11:02:58.637749] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157291 ] 00:08:34.389 [2024-11-17 11:02:58.835793] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.389 [2024-11-17 11:02:58.848545] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.389 [2024-11-17 11:02:58.900828] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:34.389 [2024-11-17 11:02:58.917159] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:34.389 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.389 INFO: Seed: 3317681279 00:08:34.389 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:34.389 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:34.389 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:34.389 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.389 #2 INITED exec/s: 0 rss: 65Mb 00:08:34.389 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:34.389 This may also happen if the target rejected all inputs we tried so far 00:08:34.389 [2024-11-17 11:02:58.975863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.389 [2024-11-17 11:02:58.975892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.389 [2024-11-17 11:02:58.975935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:34.389 [2024-11-17 11:02:58.975952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.389 [2024-11-17 11:02:58.976010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:34.389 [2024-11-17 11:02:58.976026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.659 NEW_FUNC[1/717]: 0x477ce8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:34.659 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.659 #7 NEW cov: 12283 ft: 12284 corp: 2/34b lim: 50 exec/s: 0 rss: 71Mb L: 33/33 MS: 5 CrossOver-ChangeBit-ChangeBit-CrossOver-InsertRepeatedBytes- 00:08:34.659 [2024-11-17 11:02:59.306499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.659 [2024-11-17 11:02:59.306566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.920 #8 NEW cov: 12396 ft: 13672 corp: 3/49b lim: 50 exec/s: 0 rss: 72Mb L: 15/33 MS: 1 InsertRepeatedBytes- 00:08:34.920 [2024-11-17 11:02:59.356421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.920 [2024-11-17 11:02:59.356450] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.920 #10 NEW cov: 12402 ft: 14003 corp: 4/59b lim: 50 exec/s: 0 rss: 72Mb L: 10/33 MS: 2 CMP-InsertByte- DE: "\376\362\241\302\272\254\212\000"- 00:08:34.920 [2024-11-17 11:02:59.396842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.920 [2024-11-17 11:02:59.396869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.920 [2024-11-17 11:02:59.396920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:34.920 [2024-11-17 11:02:59.396937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.920 [2024-11-17 11:02:59.396993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:34.920 [2024-11-17 11:02:59.397010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.920 #11 NEW cov: 12487 ft: 14235 corp: 5/92b lim: 50 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:34.920 [2024-11-17 11:02:59.456692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.920 [2024-11-17 11:02:59.456718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.920 #12 NEW cov: 12487 ft: 14292 corp: 6/102b lim: 50 exec/s: 0 rss: 72Mb L: 10/33 MS: 1 ChangeBinInt- 00:08:34.920 [2024-11-17 11:02:59.516867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.920 [2024-11-17 11:02:59.516894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.920 #13 NEW cov: 12487 ft: 14329 corp: 7/118b lim: 50 exec/s: 0 rss: 72Mb L: 16/33 MS: 1 InsertByte- 00:08:35.181 [2024-11-17 11:02:59.577059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.181 [2024-11-17 11:02:59.577086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.181 #17 NEW cov: 12487 ft: 14393 corp: 8/135b lim: 50 exec/s: 0 rss: 72Mb L: 17/33 MS: 4 CrossOver-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:35.181 [2024-11-17 11:02:59.617123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.181 [2024-11-17 11:02:59.617152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.181 #18 NEW cov: 12487 ft: 14417 corp: 9/152b lim: 50 exec/s: 0 rss: 72Mb L: 17/33 MS: 1 ChangeByte- 00:08:35.181 [2024-11-17 11:02:59.677317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.181 [2024-11-17 11:02:59.677346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.181 #19 NEW cov: 12487 ft: 14437 corp: 10/162b lim: 50 exec/s: 0 rss: 72Mb L: 10/33 MS: 1 
ChangeBinInt- 00:08:35.181 [2024-11-17 11:02:59.717708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.181 [2024-11-17 11:02:59.717735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.181 [2024-11-17 11:02:59.717776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.181 [2024-11-17 11:02:59.717791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.181 [2024-11-17 11:02:59.717847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.181 [2024-11-17 11:02:59.717862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.181 #21 NEW cov: 12487 ft: 14484 corp: 11/201b lim: 50 exec/s: 0 rss: 72Mb L: 39/39 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:35.181 [2024-11-17 11:02:59.757522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.181 [2024-11-17 11:02:59.757550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.181 #25 NEW cov: 12487 ft: 14711 corp: 12/211b lim: 50 exec/s: 0 rss: 72Mb L: 10/39 MS: 4 InsertByte-ChangeByte-CopyPart-PersAutoDict- DE: "\376\362\241\302\272\254\212\000"- 00:08:35.181 [2024-11-17 11:02:59.797620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.181 [2024-11-17 11:02:59.797646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.442 #26 NEW cov: 12487 ft: 14750 corp: 13/227b lim: 50 exec/s: 0 rss: 72Mb L: 16/39 MS: 1 ChangeBit- 00:08:35.442 [2024-11-17 11:02:59.857833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.442 [2024-11-17 11:02:59.857861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.442 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:35.442 #27 NEW cov: 12510 ft: 14788 corp: 14/241b lim: 50 exec/s: 0 rss: 72Mb L: 14/39 MS: 1 EraseBytes- 00:08:35.442 [2024-11-17 11:02:59.897888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.442 [2024-11-17 11:02:59.897916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.442 #28 NEW cov: 12510 ft: 14818 corp: 15/251b lim: 50 exec/s: 0 rss: 73Mb L: 10/39 MS: 1 ChangeByte- 00:08:35.442 [2024-11-17 11:02:59.958093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.442 [2024-11-17 11:02:59.958122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.442 #34 NEW cov: 12510 ft: 14830 corp: 16/267b lim: 50 exec/s: 34 rss: 73Mb L: 16/39 MS: 1 CMP- DE: "\000\021"- 00:08:35.442 [2024-11-17 11:03:00.018293] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.442 [2024-11-17 11:03:00.018322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.442 #35 NEW cov: 12510 ft: 14981 corp: 17/281b lim: 50 exec/s: 35 rss: 73Mb L: 14/39 MS: 1 ShuffleBytes- 00:08:35.442 [2024-11-17 11:03:00.078622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.442 [2024-11-17 11:03:00.078652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.442 [2024-11-17 11:03:00.078713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.442 [2024-11-17 11:03:00.078730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.702 #37 NEW cov: 12510 ft: 15333 corp: 18/302b lim: 50 exec/s: 37 rss: 73Mb L: 21/39 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:35.702 [2024-11-17 11:03:00.118877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.702 [2024-11-17 11:03:00.118911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.702 [2024-11-17 11:03:00.118953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.702 [2024-11-17 11:03:00.118969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.702 [2024-11-17 11:03:00.119025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.702 [2024-11-17 11:03:00.119049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.702 #38 NEW cov: 12510 ft: 15409 corp: 19/332b lim: 50 exec/s: 38 rss: 73Mb L: 30/39 MS: 1 CopyPart- 00:08:35.702 [2024-11-17 11:03:00.179225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.702 [2024-11-17 11:03:00.179252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.702 [2024-11-17 11:03:00.179296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.702 [2024-11-17 11:03:00.179312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.702 [2024-11-17 11:03:00.179368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.702 [2024-11-17 11:03:00.179384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.702 [2024-11-17 11:03:00.179443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.702 [2024-11-17 11:03:00.179458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.702 #39 NEW cov: 12510 ft: 15790 
corp: 20/373b lim: 50 exec/s: 39 rss: 73Mb L: 41/41 MS: 1 PersAutoDict- DE: "\376\362\241\302\272\254\212\000"- 00:08:35.702 [2024-11-17 11:03:00.218820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.702 [2024-11-17 11:03:00.218848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.702 #40 NEW cov: 12510 ft: 15815 corp: 21/389b lim: 50 exec/s: 40 rss: 73Mb L: 16/41 MS: 1 ChangeBit- 00:08:35.702 [2024-11-17 11:03:00.258950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.703 [2024-11-17 11:03:00.258978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.703 #41 NEW cov: 12510 ft: 15824 corp: 22/406b lim: 50 exec/s: 41 rss: 73Mb L: 17/41 MS: 1 InsertByte- 00:08:35.703 [2024-11-17 11:03:00.299259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.703 [2024-11-17 11:03:00.299285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.703 [2024-11-17 11:03:00.299331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.703 [2024-11-17 11:03:00.299346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.703 #42 NEW cov: 12510 ft: 15870 corp: 23/427b lim: 50 exec/s: 42 rss: 73Mb L: 21/41 MS: 1 EraseBytes- 00:08:35.963 [2024-11-17 11:03:00.359516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.963 [2024-11-17 11:03:00.359543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.963 [2024-11-17 11:03:00.359587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.963 [2024-11-17 11:03:00.359608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.963 [2024-11-17 11:03:00.359664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.963 [2024-11-17 11:03:00.359679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.963 #43 NEW cov: 12510 ft: 15893 corp: 24/466b lim: 50 exec/s: 43 rss: 73Mb L: 39/41 MS: 1 ChangeBit- 00:08:35.963 [2024-11-17 11:03:00.419415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.963 [2024-11-17 11:03:00.419443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.963 #44 NEW cov: 12510 ft: 15896 corp: 25/484b lim: 50 exec/s: 44 rss: 73Mb L: 18/41 MS: 1 CopyPart- 00:08:35.963 [2024-11-17 11:03:00.479620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.963 [2024-11-17 11:03:00.479648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:35.963 #45 NEW cov: 12510 ft: 15943 corp: 26/494b lim: 50 exec/s: 45 rss: 73Mb L: 10/41 MS: 1 ChangeBinInt- 00:08:35.963 [2024-11-17 11:03:00.519691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.963 [2024-11-17 11:03:00.519718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.963 #46 NEW cov: 12510 ft: 15955 corp: 27/510b lim: 50 exec/s: 46 rss: 73Mb L: 16/41 MS: 1 CopyPart- 00:08:35.963 [2024-11-17 11:03:00.560256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.963 [2024-11-17 11:03:00.560283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.963 [2024-11-17 11:03:00.560332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.963 [2024-11-17 11:03:00.560348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.963 [2024-11-17 11:03:00.560402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.963 [2024-11-17 11:03:00.560417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.963 [2024-11-17 11:03:00.560473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.963 [2024-11-17 11:03:00.560489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.963 #47 NEW cov: 12510 ft: 15964 corp: 28/554b lim: 50 exec/s: 47 rss: 73Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:35.963 [2024-11-17 11:03:00.600330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.963 [2024-11-17 11:03:00.600357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.963 [2024-11-17 11:03:00.600406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.964 [2024-11-17 11:03:00.600422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.964 [2024-11-17 11:03:00.600478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.964 [2024-11-17 11:03:00.600493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.964 [2024-11-17 11:03:00.600549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.964 [2024-11-17 11:03:00.600569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.224 #49 NEW cov: 12510 ft: 16053 corp: 29/596b lim: 50 exec/s: 49 rss: 73Mb L: 42/44 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:36.224 [2024-11-17 11:03:00.660538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.224 [2024-11-17 
11:03:00.660565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.224 [2024-11-17 11:03:00.660614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.224 [2024-11-17 11:03:00.660631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.224 [2024-11-17 11:03:00.660686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.224 [2024-11-17 11:03:00.660703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.224 [2024-11-17 11:03:00.660759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.224 [2024-11-17 11:03:00.660775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.224 #50 NEW cov: 12510 ft: 16100 corp: 30/639b lim: 50 exec/s: 50 rss: 73Mb L: 43/44 MS: 1 CrossOver- 00:08:36.224 [2024-11-17 11:03:00.720400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.224 [2024-11-17 11:03:00.720428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.224 [2024-11-17 11:03:00.720474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.224 [2024-11-17 11:03:00.720489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.224 #51 NEW cov: 12510 ft: 16112 corp: 31/661b lim: 50 exec/s: 51 rss: 74Mb L: 22/44 MS: 1 CopyPart- 00:08:36.224 [2024-11-17 11:03:00.780423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.224 [2024-11-17 11:03:00.780452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.224 #52 NEW cov: 12510 ft: 16120 corp: 32/671b lim: 50 exec/s: 52 rss: 74Mb L: 10/44 MS: 1 ChangeBit- 00:08:36.224 [2024-11-17 11:03:00.840561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.224 [2024-11-17 11:03:00.840590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.224 #53 NEW cov: 12510 ft: 16144 corp: 33/688b lim: 50 exec/s: 53 rss: 74Mb L: 17/44 MS: 1 InsertByte- 00:08:36.485 [2024-11-17 11:03:00.880635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.485 [2024-11-17 11:03:00.880664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.485 #54 NEW cov: 12510 ft: 16147 corp: 34/702b lim: 50 exec/s: 54 rss: 74Mb L: 14/44 MS: 1 ChangeBinInt- 00:08:36.485 [2024-11-17 11:03:00.921241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.485 [2024-11-17 11:03:00.921268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.485 [2024-11-17 11:03:00.921316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.485 [2024-11-17 11:03:00.921332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.485 [2024-11-17 11:03:00.921387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.485 [2024-11-17 11:03:00.921406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.485 [2024-11-17 11:03:00.921460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.485 [2024-11-17 11:03:00.921476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.485 #55 NEW cov: 12510 ft: 16167 corp: 35/743b lim: 50 exec/s: 27 rss: 74Mb L: 41/44 MS: 1 InsertRepeatedBytes- 00:08:36.485 #55 DONE cov: 12510 ft: 16167 corp: 35/743b lim: 50 exec/s: 27 rss: 74Mb 00:08:36.485 ###### Recommended dictionary. ###### 00:08:36.485 "\376\362\241\302\272\254\212\000" # Uses: 2 00:08:36.485 "\000\021" # Uses: 0 00:08:36.485 ###### End of recommended dictionary. ###### 00:08:36.485 Done 55 runs in 2 second(s) 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:36.485 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:36.486 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:36.486 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:36.486 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo 
leak:nvmf_ctrlr_create 00:08:36.486 11:03:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:36.486 [2024-11-17 11:03:01.105096] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:36.486 [2024-11-17 11:03:01.105162] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157696 ] 00:08:36.747 [2024-11-17 11:03:01.304071] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.747 [2024-11-17 11:03:01.317882] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.747 [2024-11-17 11:03:01.370620] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:36.747 [2024-11-17 11:03:01.386958] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:36.747 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.747 INFO: Seed: 1491796441 00:08:37.008 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:37.008 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:37.008 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:37.008 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.008 #2 INITED exec/s: 0 rss: 65Mb 00:08:37.008 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:37.008 This may also happen if the target rejected all inputs we tried so far 00:08:37.008 [2024-11-17 11:03:01.455606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.008 [2024-11-17 11:03:01.455651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.008 [2024-11-17 11:03:01.455770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.008 [2024-11-17 11:03:01.455791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.269 NEW_FUNC[1/717]: 0x479fb8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:37.269 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.269 #4 NEW cov: 12309 ft: 12306 corp: 2/39b lim: 85 exec/s: 0 rss: 72Mb L: 38/38 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:37.269 [2024-11-17 11:03:01.806806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.269 [2024-11-17 11:03:01.806864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.269 [2024-11-17 11:03:01.807008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.269 [2024-11-17 11:03:01.807037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.269 #12 NEW cov: 12422 ft: 12988 corp: 3/84b lim: 85 exec/s: 0 rss: 72Mb L: 45/45 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:37.269 [2024-11-17 11:03:01.856796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.269 [2024-11-17 11:03:01.856828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.269 [2024-11-17 11:03:01.856979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.269 [2024-11-17 11:03:01.857002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.269 #18 NEW cov: 12428 ft: 13277 corp: 4/122b lim: 85 exec/s: 0 rss: 72Mb L: 38/45 MS: 1 ChangeByte- 00:08:37.529 [2024-11-17 11:03:01.927014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.529 [2024-11-17 11:03:01.927053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.529 [2024-11-17 11:03:01.927199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.529 [2024-11-17 11:03:01.927223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.529 #19 NEW cov: 12513 ft: 13532 corp: 5/161b lim: 85 exec/s: 0 rss: 72Mb L: 39/45 MS: 1 InsertByte- 00:08:37.529 [2024-11-17 11:03:01.977295] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.529 [2024-11-17 11:03:01.977329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.529 [2024-11-17 11:03:01.977470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.529 [2024-11-17 11:03:01.977497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.529 [2024-11-17 11:03:01.977635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:37.529 [2024-11-17 11:03:01.977656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.529 #20 NEW cov: 12513 ft: 13962 corp: 6/224b lim: 85 exec/s: 0 rss: 72Mb L: 63/63 MS: 1 CopyPart- 00:08:37.529 [2024-11-17 11:03:02.027356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.529 [2024-11-17 11:03:02.027394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.530 [2024-11-17 11:03:02.027536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.530 [2024-11-17 11:03:02.027560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.530 [2024-11-17 11:03:02.027690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:37.530 [2024-11-17 11:03:02.027714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.530 #21 NEW cov: 12513 ft: 14089 corp: 7/287b lim: 85 exec/s: 0 rss: 73Mb L: 63/63 MS: 1 ChangeBinInt- 00:08:37.530 [2024-11-17 11:03:02.097432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.530 [2024-11-17 11:03:02.097469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.530 [2024-11-17 11:03:02.097599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.530 [2024-11-17 11:03:02.097626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.530 #22 NEW cov: 12513 ft: 14239 corp: 8/325b lim: 85 exec/s: 0 rss: 73Mb L: 38/63 MS: 1 CopyPart- 00:08:37.530 [2024-11-17 11:03:02.168141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.530 [2024-11-17 11:03:02.168176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.530 [2024-11-17 11:03:02.168309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.530 [2024-11-17 11:03:02.168333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.791 #23 NEW cov: 12522 ft: 14446 corp: 9/363b lim: 85 
exec/s: 0 rss: 73Mb L: 38/63 MS: 1 ChangeBinInt- 00:08:37.791 [2024-11-17 11:03:02.218019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.791 [2024-11-17 11:03:02.218048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.791 [2024-11-17 11:03:02.218198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.791 [2024-11-17 11:03:02.218220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.791 #24 NEW cov: 12522 ft: 14470 corp: 10/399b lim: 85 exec/s: 0 rss: 73Mb L: 36/63 MS: 1 CrossOver- 00:08:37.791 [2024-11-17 11:03:02.267981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.791 [2024-11-17 11:03:02.268016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.791 [2024-11-17 11:03:02.268168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.791 [2024-11-17 11:03:02.268190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.791 #25 NEW cov: 12522 ft: 14514 corp: 11/438b lim: 85 exec/s: 0 rss: 73Mb L: 39/63 MS: 1 ChangeByte- 00:08:37.791 [2024-11-17 11:03:02.338225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.791 [2024-11-17 11:03:02.338257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.791 [2024-11-17 11:03:02.338409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.791 [2024-11-17 11:03:02.338434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.791 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:37.791 #26 NEW cov: 12545 ft: 14574 corp: 12/476b lim: 85 exec/s: 0 rss: 73Mb L: 38/63 MS: 1 CopyPart- 00:08:37.791 [2024-11-17 11:03:02.388358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.791 [2024-11-17 11:03:02.388384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.791 [2024-11-17 11:03:02.388534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.791 [2024-11-17 11:03:02.388558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.791 #27 NEW cov: 12545 ft: 14596 corp: 13/515b lim: 85 exec/s: 27 rss: 73Mb L: 39/63 MS: 1 ChangeASCIIInt- 00:08:38.060 [2024-11-17 11:03:02.458652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.060 [2024-11-17 11:03:02.458682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.060 [2024-11-17 
11:03:02.458822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.060 [2024-11-17 11:03:02.458845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.060 #28 NEW cov: 12545 ft: 14639 corp: 14/552b lim: 85 exec/s: 28 rss: 73Mb L: 37/63 MS: 1 InsertByte- 00:08:38.060 [2024-11-17 11:03:02.528836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.060 [2024-11-17 11:03:02.528868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.060 [2024-11-17 11:03:02.529011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.060 [2024-11-17 11:03:02.529036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.060 #29 NEW cov: 12545 ft: 14669 corp: 15/592b lim: 85 exec/s: 29 rss: 73Mb L: 40/63 MS: 1 InsertByte- 00:08:38.060 [2024-11-17 11:03:02.579208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.060 [2024-11-17 11:03:02.579237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.060 [2024-11-17 11:03:02.579320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.060 [2024-11-17 11:03:02.579345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.060 [2024-11-17 11:03:02.579472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.060 [2024-11-17 11:03:02.579495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.060 #30 NEW cov: 12545 ft: 14714 corp: 16/646b lim: 85 exec/s: 30 rss: 73Mb L: 54/63 MS: 1 CopyPart- 00:08:38.060 [2024-11-17 11:03:02.649423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.060 [2024-11-17 11:03:02.649459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.060 [2024-11-17 11:03:02.649580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.060 [2024-11-17 11:03:02.649605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.060 #31 NEW cov: 12545 ft: 14720 corp: 17/684b lim: 85 exec/s: 31 rss: 73Mb L: 38/63 MS: 1 ChangeBit- 00:08:38.321 [2024-11-17 11:03:02.719312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.321 [2024-11-17 11:03:02.719339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.321 #32 NEW cov: 12545 ft: 15494 corp: 18/705b lim: 85 exec/s: 32 rss: 73Mb L: 21/63 MS: 1 CrossOver- 00:08:38.321 [2024-11-17 11:03:02.789835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:0 nsid:0 00:08:38.321 [2024-11-17 11:03:02.789871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.321 [2024-11-17 11:03:02.789997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.321 [2024-11-17 11:03:02.790023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.321 #33 NEW cov: 12545 ft: 15522 corp: 19/743b lim: 85 exec/s: 33 rss: 74Mb L: 38/63 MS: 1 CrossOver- 00:08:38.321 [2024-11-17 11:03:02.860167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.321 [2024-11-17 11:03:02.860203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.321 [2024-11-17 11:03:02.860312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.321 [2024-11-17 11:03:02.860336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.321 [2024-11-17 11:03:02.860470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.321 [2024-11-17 11:03:02.860494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.321 #34 NEW cov: 12545 ft: 15540 corp: 20/808b lim: 85 exec/s: 34 rss: 74Mb L: 65/65 MS: 1 CopyPart- 00:08:38.321 [2024-11-17 11:03:02.910278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.321 [2024-11-17 11:03:02.910314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.321 [2024-11-17 11:03:02.910437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.321 [2024-11-17 11:03:02.910457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.321 [2024-11-17 11:03:02.910595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.321 [2024-11-17 11:03:02.910620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.321 #35 NEW cov: 12545 ft: 15552 corp: 21/862b lim: 85 exec/s: 35 rss: 74Mb L: 54/65 MS: 1 ChangeByte- 00:08:38.582 [2024-11-17 11:03:02.980612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.582 [2024-11-17 11:03:02.980648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.582 [2024-11-17 11:03:02.980762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.583 [2024-11-17 11:03:02.980785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.583 [2024-11-17 11:03:02.980912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:2 nsid:0 00:08:38.583 [2024-11-17 11:03:02.980937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.583 #36 NEW cov: 12545 ft: 15563 corp: 22/927b lim: 85 exec/s: 36 rss: 74Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:08:38.583 [2024-11-17 11:03:03.030715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.583 [2024-11-17 11:03:03.030752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.583 [2024-11-17 11:03:03.030899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.583 [2024-11-17 11:03:03.030920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.583 [2024-11-17 11:03:03.031062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.583 [2024-11-17 11:03:03.031083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.583 #37 NEW cov: 12545 ft: 15572 corp: 23/980b lim: 85 exec/s: 37 rss: 74Mb L: 53/65 MS: 1 EraseBytes- 00:08:38.583 [2024-11-17 11:03:03.100397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.583 [2024-11-17 11:03:03.100432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.583 #38 NEW cov: 12545 ft: 15601 corp: 24/1003b lim: 85 exec/s: 38 rss: 74Mb L: 23/65 MS: 1 EraseBytes- 00:08:38.583 [2024-11-17 11:03:03.150905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.583 [2024-11-17 11:03:03.150935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.583 [2024-11-17 11:03:03.151087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.583 [2024-11-17 11:03:03.151114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.583 #39 NEW cov: 12545 ft: 15619 corp: 25/1042b lim: 85 exec/s: 39 rss: 74Mb L: 39/65 MS: 1 InsertByte- 00:08:38.583 [2024-11-17 11:03:03.221611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.583 [2024-11-17 11:03:03.221644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.583 [2024-11-17 11:03:03.221792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.583 [2024-11-17 11:03:03.221815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.583 [2024-11-17 11:03:03.221938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.583 [2024-11-17 11:03:03.221955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:38.583 [2024-11-17 11:03:03.222021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:38.583 [2024-11-17 11:03:03.222048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.844 #40 NEW cov: 12545 ft: 16003 corp: 26/1118b lim: 85 exec/s: 40 rss: 74Mb L: 76/76 MS: 1 InsertRepeatedBytes- 00:08:38.844 [2024-11-17 11:03:03.271614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.844 [2024-11-17 11:03:03.271653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.271759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.844 [2024-11-17 11:03:03.271780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.271917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.844 [2024-11-17 11:03:03.271935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.844 #41 NEW cov: 12545 ft: 16027 corp: 27/1185b lim: 85 exec/s: 41 rss: 74Mb L: 67/76 MS: 1 CopyPart- 00:08:38.844 [2024-11-17 11:03:03.341728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.844 [2024-11-17 11:03:03.341759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.341900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.844 [2024-11-17 11:03:03.341918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.342062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.844 [2024-11-17 11:03:03.342085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.844 #42 NEW cov: 12545 ft: 16037 corp: 28/1250b lim: 85 exec/s: 42 rss: 74Mb L: 65/76 MS: 1 ChangeBinInt- 00:08:38.844 [2024-11-17 11:03:03.391775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.844 [2024-11-17 11:03:03.391807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.391930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.844 [2024-11-17 11:03:03.391952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.392080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.844 [2024-11-17 11:03:03.392102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:38.844 #43 NEW cov: 12545 ft: 16046 corp: 29/1302b lim: 85 exec/s: 43 rss: 74Mb L: 52/76 MS: 1 InsertRepeatedBytes- 00:08:38.844 [2024-11-17 11:03:03.442330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.844 [2024-11-17 11:03:03.442361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.442488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.844 [2024-11-17 11:03:03.442511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.442643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.844 [2024-11-17 11:03:03.442665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.844 [2024-11-17 11:03:03.442801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:38.844 [2024-11-17 11:03:03.442825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.844 #44 NEW cov: 12545 ft: 16050 corp: 30/1378b lim: 85 exec/s: 22 rss: 74Mb L: 76/76 MS: 1 ChangeByte- 00:08:38.844 #44 DONE cov: 12545 ft: 16050 corp: 30/1378b lim: 85 exec/s: 22 rss: 74Mb 00:08:38.844 Done 44 runs in 2 second(s) 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:39.105 11:03:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:39.105 [2024-11-17 11:03:03.627415] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:39.105 [2024-11-17 11:03:03.627485] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158274 ] 00:08:39.366 [2024-11-17 11:03:03.824336] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.366 [2024-11-17 11:03:03.838484] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.366 [2024-11-17 11:03:03.891241] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.366 [2024-11-17 11:03:03.907535] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:39.366 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.366 INFO: Seed: 4015723651 00:08:39.366 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:39.366 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:39.366 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:39.366 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.366 #2 INITED exec/s: 0 rss: 65Mb 00:08:39.366 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:39.366 This may also happen if the target rejected all inputs we tried so far 00:08:39.366 [2024-11-17 11:03:03.962935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.366 [2024-11-17 11:03:03.962973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.366 [2024-11-17 11:03:03.963038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.366 [2024-11-17 11:03:03.963065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.938 NEW_FUNC[1/716]: 0x47d1f8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:39.938 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:39.938 #13 NEW cov: 12242 ft: 12241 corp: 2/14b lim: 25 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:08:39.938 [2024-11-17 11:03:04.303701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.938 [2024-11-17 11:03:04.303744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.303812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.938 [2024-11-17 11:03:04.303832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.938 #14 NEW cov: 12355 ft: 12759 corp: 3/27b lim: 25 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ChangeBit- 00:08:39.938 [2024-11-17 11:03:04.364007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.938 [2024-11-17 11:03:04.364036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.364086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.938 [2024-11-17 11:03:04.364103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.364157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.938 [2024-11-17 11:03:04.364173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.364229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:39.938 [2024-11-17 11:03:04.364246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.938 #15 NEW cov: 12361 ft: 13341 corp: 4/51b lim: 25 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 CrossOver- 00:08:39.938 [2024-11-17 11:03:04.423939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.938 [2024-11-17 11:03:04.423969] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.424014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.938 [2024-11-17 11:03:04.424030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.938 #16 NEW cov: 12446 ft: 13677 corp: 5/64b lim: 25 exec/s: 0 rss: 72Mb L: 13/24 MS: 1 CopyPart- 00:08:39.938 [2024-11-17 11:03:04.464025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.938 [2024-11-17 11:03:04.464057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.464096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.938 [2024-11-17 11:03:04.464112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.938 #17 NEW cov: 12446 ft: 13852 corp: 6/76b lim: 25 exec/s: 0 rss: 72Mb L: 12/24 MS: 1 InsertRepeatedBytes- 00:08:39.938 [2024-11-17 11:03:04.504365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.938 [2024-11-17 11:03:04.504392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.504438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.938 [2024-11-17 11:03:04.504455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.504508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.938 [2024-11-17 11:03:04.504524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.938 [2024-11-17 11:03:04.504577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:39.938 [2024-11-17 11:03:04.504594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.939 #18 NEW cov: 12446 ft: 14009 corp: 7/100b lim: 25 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:39.939 [2024-11-17 11:03:04.544603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.939 [2024-11-17 11:03:04.544630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.939 [2024-11-17 11:03:04.544685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.939 [2024-11-17 11:03:04.544700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.939 [2024-11-17 11:03:04.544752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.939 [2024-11-17 11:03:04.544767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.939 [2024-11-17 11:03:04.544821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:39.939 [2024-11-17 11:03:04.544837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.939 [2024-11-17 11:03:04.544889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:39.939 [2024-11-17 11:03:04.544905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:39.939 #19 NEW cov: 12446 ft: 14162 corp: 8/125b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 InsertByte- 00:08:40.199 [2024-11-17 11:03:04.604458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.199 [2024-11-17 11:03:04.604485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.199 [2024-11-17 11:03:04.604522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.199 [2024-11-17 11:03:04.604536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.199 #20 NEW cov: 12446 ft: 14255 corp: 9/138b lim: 25 exec/s: 0 rss: 72Mb L: 13/25 MS: 1 CopyPart- 00:08:40.199 [2024-11-17 11:03:04.664615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.199 [2024-11-17 11:03:04.664642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.199 [2024-11-17 11:03:04.664696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.199 [2024-11-17 11:03:04.664711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.199 #24 NEW cov: 12446 ft: 14279 corp: 10/150b lim: 25 exec/s: 0 rss: 72Mb L: 12/25 MS: 4 ChangeByte-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:40.199 [2024-11-17 11:03:04.704687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.199 [2024-11-17 11:03:04.704714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.199 [2024-11-17 11:03:04.704750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.199 [2024-11-17 11:03:04.704765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.199 #25 NEW cov: 12446 ft: 14338 corp: 11/163b lim: 25 exec/s: 0 rss: 72Mb L: 13/25 MS: 1 ChangeByte- 00:08:40.199 [2024-11-17 11:03:04.744946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.199 [2024-11-17 11:03:04.744974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.199 [2024-11-17 11:03:04.745008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.199 [2024-11-17 11:03:04.745025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.199 [2024-11-17 11:03:04.745079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.199 [2024-11-17 11:03:04.745094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.199 #26 NEW cov: 12446 ft: 14556 corp: 12/181b lim: 25 exec/s: 0 rss: 72Mb L: 18/25 MS: 1 InsertRepeatedBytes- 00:08:40.199 [2024-11-17 11:03:04.784802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.199 [2024-11-17 11:03:04.784830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.199 #27 NEW cov: 12446 ft: 14918 corp: 13/186b lim: 25 exec/s: 0 rss: 72Mb L: 5/25 MS: 1 CMP- DE: "\006\000\000\000"- 00:08:40.199 [2024-11-17 11:03:04.825046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.199 [2024-11-17 11:03:04.825073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.199 [2024-11-17 11:03:04.825112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.199 [2024-11-17 11:03:04.825127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.199 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:40.199 #28 NEW cov: 12469 ft: 14937 corp: 14/199b lim: 25 exec/s: 0 rss: 72Mb L: 13/25 MS: 1 ChangeBinInt- 00:08:40.460 [2024-11-17 11:03:04.865506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.460 [2024-11-17 11:03:04.865534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.865584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.460 [2024-11-17 11:03:04.865600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.865653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.460 [2024-11-17 11:03:04.865683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.865735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.460 [2024-11-17 11:03:04.865754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.865807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:40.460 [2024-11-17 11:03:04.865823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 
sqhd:0006 p:0 m:0 dnr:1 00:08:40.460 #29 NEW cov: 12469 ft: 14949 corp: 15/224b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:40.460 [2024-11-17 11:03:04.925231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.460 [2024-11-17 11:03:04.925259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.460 #30 NEW cov: 12469 ft: 14996 corp: 16/231b lim: 25 exec/s: 30 rss: 73Mb L: 7/25 MS: 1 CrossOver- 00:08:40.460 [2024-11-17 11:03:04.985791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.460 [2024-11-17 11:03:04.985818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.985869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.460 [2024-11-17 11:03:04.985884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.985936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.460 [2024-11-17 11:03:04.985952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:04.986005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.460 [2024-11-17 11:03:04.986020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.460 #31 NEW cov: 12469 ft: 15016 corp: 17/255b lim: 25 exec/s: 31 rss: 73Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:40.460 [2024-11-17 11:03:05.045686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.460 [2024-11-17 11:03:05.045712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:05.045750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.460 [2024-11-17 11:03:05.045765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.460 #33 NEW cov: 12469 ft: 15104 corp: 18/265b lim: 25 exec/s: 33 rss: 73Mb L: 10/25 MS: 2 InsertByte-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:40.460 [2024-11-17 11:03:05.086001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.460 [2024-11-17 11:03:05.086029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:05.086083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.460 [2024-11-17 11:03:05.086099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:05.086151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.460 
[2024-11-17 11:03:05.086165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.460 [2024-11-17 11:03:05.086217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.460 [2024-11-17 11:03:05.086236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.722 #34 NEW cov: 12469 ft: 15130 corp: 19/286b lim: 25 exec/s: 34 rss: 73Mb L: 21/25 MS: 1 InsertRepeatedBytes- 00:08:40.722 [2024-11-17 11:03:05.145955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.722 [2024-11-17 11:03:05.145982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.722 [2024-11-17 11:03:05.146018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.722 [2024-11-17 11:03:05.146034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.722 #35 NEW cov: 12469 ft: 15161 corp: 20/297b lim: 25 exec/s: 35 rss: 73Mb L: 11/25 MS: 1 EraseBytes- 00:08:40.722 [2024-11-17 11:03:05.206124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.722 [2024-11-17 11:03:05.206151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.722 [2024-11-17 11:03:05.206206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.722 [2024-11-17 11:03:05.206222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.722 #36 NEW cov: 12469 ft: 15163 corp: 21/309b lim: 25 exec/s: 36 rss: 73Mb L: 12/25 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:40.722 [2024-11-17 11:03:05.246219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.722 [2024-11-17 11:03:05.246245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.722 [2024-11-17 11:03:05.246297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.722 [2024-11-17 11:03:05.246312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.722 #39 NEW cov: 12469 ft: 15175 corp: 22/320b lim: 25 exec/s: 39 rss: 73Mb L: 11/25 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:40.722 [2024-11-17 11:03:05.306379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.722 [2024-11-17 11:03:05.306407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.722 [2024-11-17 11:03:05.306450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.722 [2024-11-17 11:03:05.306467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:40.722 #40 NEW cov: 12469 ft: 15229 corp: 23/333b lim: 25 exec/s: 40 rss: 73Mb L: 13/25 MS: 1 CopyPart- 00:08:40.722 [2024-11-17 11:03:05.346389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.722 [2024-11-17 11:03:05.346416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.722 #41 NEW cov: 12469 ft: 15263 corp: 24/339b lim: 25 exec/s: 41 rss: 73Mb L: 6/25 MS: 1 InsertByte- 00:08:40.983 [2024-11-17 11:03:05.386493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.983 [2024-11-17 11:03:05.386522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.983 #42 NEW cov: 12469 ft: 15306 corp: 25/346b lim: 25 exec/s: 42 rss: 73Mb L: 7/25 MS: 1 EraseBytes- 00:08:40.983 [2024-11-17 11:03:05.426736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.983 [2024-11-17 11:03:05.426763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.983 [2024-11-17 11:03:05.426818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.983 [2024-11-17 11:03:05.426832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.983 #43 NEW cov: 12469 ft: 15342 corp: 26/359b lim: 25 exec/s: 43 rss: 73Mb L: 13/25 MS: 1 ShuffleBytes- 00:08:40.983 [2024-11-17 11:03:05.467228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.983 [2024-11-17 11:03:05.467257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.983 [2024-11-17 11:03:05.467302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.983 [2024-11-17 11:03:05.467321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.983 [2024-11-17 11:03:05.467375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.983 [2024-11-17 11:03:05.467390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.983 [2024-11-17 11:03:05.467442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.983 [2024-11-17 11:03:05.467456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.984 [2024-11-17 11:03:05.467514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:40.984 [2024-11-17 11:03:05.467530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:40.984 #44 NEW cov: 12469 ft: 15353 corp: 27/384b lim: 25 exec/s: 44 rss: 73Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:40.984 [2024-11-17 11:03:05.527013] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.984 [2024-11-17 11:03:05.527044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.984 [2024-11-17 11:03:05.527097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.984 [2024-11-17 11:03:05.527114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.984 #45 NEW cov: 12469 ft: 15364 corp: 28/397b lim: 25 exec/s: 45 rss: 73Mb L: 13/25 MS: 1 ChangeBit- 00:08:40.984 [2024-11-17 11:03:05.587046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.984 [2024-11-17 11:03:05.587089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.984 #46 NEW cov: 12469 ft: 15380 corp: 29/404b lim: 25 exec/s: 46 rss: 73Mb L: 7/25 MS: 1 InsertByte- 00:08:41.245 [2024-11-17 11:03:05.647601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.245 [2024-11-17 11:03:05.647629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.647671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.245 [2024-11-17 11:03:05.647687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.647741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.245 [2024-11-17 11:03:05.647757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.647810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:41.245 [2024-11-17 11:03:05.647829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.245 #47 NEW cov: 12469 ft: 15387 corp: 30/425b lim: 25 exec/s: 47 rss: 73Mb L: 21/25 MS: 1 ChangeBit- 00:08:41.245 [2024-11-17 11:03:05.707576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.245 [2024-11-17 11:03:05.707604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.707656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.245 [2024-11-17 11:03:05.707673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.245 #48 NEW cov: 12469 ft: 15398 corp: 31/438b lim: 25 exec/s: 48 rss: 74Mb L: 13/25 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:41.245 [2024-11-17 11:03:05.767676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.245 [2024-11-17 11:03:05.767703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.767747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.245 [2024-11-17 11:03:05.767761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.245 #49 NEW cov: 12469 ft: 15402 corp: 32/451b lim: 25 exec/s: 49 rss: 74Mb L: 13/25 MS: 1 ChangeBit- 00:08:41.245 [2024-11-17 11:03:05.807800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.245 [2024-11-17 11:03:05.807826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.807862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.245 [2024-11-17 11:03:05.807878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.245 #50 NEW cov: 12469 ft: 15425 corp: 33/463b lim: 25 exec/s: 50 rss: 74Mb L: 12/25 MS: 1 ShuffleBytes- 00:08:41.245 [2024-11-17 11:03:05.847795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.245 [2024-11-17 11:03:05.847821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.245 #51 NEW cov: 12469 ft: 15532 corp: 34/469b lim: 25 exec/s: 51 rss: 74Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:41.245 [2024-11-17 11:03:05.888021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.245 [2024-11-17 11:03:05.888053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.245 [2024-11-17 11:03:05.888102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.245 [2024-11-17 11:03:05.888118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.506 #52 NEW cov: 12469 ft: 15606 corp: 35/482b lim: 25 exec/s: 52 rss: 74Mb L: 13/25 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:41.506 [2024-11-17 11:03:05.928319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.506 [2024-11-17 11:03:05.928346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.506 [2024-11-17 11:03:05.928391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.506 [2024-11-17 11:03:05.928407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.506 [2024-11-17 11:03:05.928464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.506 [2024-11-17 11:03:05.928481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.507 [2024-11-17 11:03:05.928535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:41.507 [2024-11-17 11:03:05.928552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.507 #53 NEW cov: 12469 ft: 15615 corp: 36/504b lim: 25 exec/s: 26 rss: 74Mb L: 22/25 MS: 1 InsertByte- 00:08:41.507 #53 DONE cov: 12469 ft: 15615 corp: 36/504b lim: 25 exec/s: 26 rss: 74Mb 00:08:41.507 ###### Recommended dictionary. ###### 00:08:41.507 "\006\000\000\000" # Uses: 0 00:08:41.507 "\000\000\000\000\000\000\000\000" # Uses: 3 00:08:41.507 ###### End of recommended dictionary. ###### 00:08:41.507 Done 53 runs in 2 second(s) 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:41.507 11:03:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:41.507 [2024-11-17 11:03:06.114139] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
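The nvmf/run.sh trace above boils down to a short shell sequence: derive a per-fuzzer TCP port, rewrite the JSON config to listen on it, set up LSAN suppressions for two known intentional leaks, and launch the fuzzer against its corpus directory. A condensed sketch of those steps, not the verbatim script: $rootdir and $output_dir stand in for the long Jenkins workspace paths, deriving port 4424 from fuzzer number 24 is inferred from the printf/port lines in the trace, and the redirection targets for the leak suppressions are likewise inferred from the suppress_nvmf_fuzz path used elsewhere in the trace:

    fuzzer=24; timen=1; core=0x1
    corpus="$rootdir/../corpus/llvm_nvmf_$fuzzer"
    port="44$(printf %02d "$fuzzer")"        # printf %02d 24 -> "24", so port=4424
    mkdir -p "$corpus"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # Point the stock fuzz_json.conf at the per-run port:
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$fuzzer.conf"
    # Suppress two leaks that are expected during fuzzing:
    echo leak:spdk_nvmf_qpair_disconnect  > /var/tmp/suppress_nvmf_fuzz
    echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_nvmf_fuzz
    # Launch the fuzzer: 1 core, 512 MB hugepages, 1 minute, fuzzer id 24
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$output_dir/llvm/" -F "$trid" -c "/tmp/fuzz_json_$fuzzer.conf" \
        -t "$timen" -D "$corpus" -Z "$fuzzer"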
00:08:41.507 [2024-11-17 11:03:06.114207] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid159087 ] 00:08:41.768 [2024-11-17 11:03:06.321751] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.768 [2024-11-17 11:03:06.334716] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.768 [2024-11-17 11:03:06.387081] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:41.768 [2024-11-17 11:03:06.403420] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:41.768 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.768 INFO: Seed: 2214760685 00:08:42.028 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:42.028 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:42.028 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:42.028 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.028 #2 INITED exec/s: 0 rss: 65Mb 00:08:42.028 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.028 This may also happen if the target rejected all inputs we tried so far 00:08:42.028 [2024-11-17 11:03:06.462566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.028 [2024-11-17 11:03:06.462597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.289 NEW_FUNC[1/717]: 0x47e2e8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:42.289 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:42.289 #10 NEW cov: 12310 ft: 12315 corp: 2/22b lim: 100 exec/s: 0 rss: 72Mb L: 21/21 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:42.289 [2024-11-17 11:03:06.793528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.289 [2024-11-17 11:03:06.793590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.289 #21 NEW cov: 12427 ft: 13115 corp: 3/44b lim: 100 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertByte- 00:08:42.289 [2024-11-17 11:03:06.863459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.289 [2024-11-17 11:03:06.863488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.289 #22 NEW cov: 12433 ft: 13380 corp: 4/80b lim: 100 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 CrossOver- 00:08:42.289 [2024-11-17 11:03:06.903570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.289 [2024-11-17 11:03:06.903600] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 #28 NEW cov: 12518 ft: 13638 corp: 5/116b lim: 100 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:42.551 [2024-11-17 11:03:06.963730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:06.963761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 #29 NEW cov: 12518 ft: 13711 corp: 6/152b lim: 100 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 ChangeBit- 00:08:42.551 [2024-11-17 11:03:07.003841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.003870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 #30 NEW cov: 12518 ft: 13754 corp: 7/188b lim: 100 exec/s: 0 rss: 73Mb L: 36/36 MS: 1 ShuffleBytes- 00:08:42.551 [2024-11-17 11:03:07.064006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.064035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 #31 NEW cov: 12518 ft: 13801 corp: 8/224b lim: 100 exec/s: 0 rss: 73Mb L: 36/36 MS: 1 ShuffleBytes- 00:08:42.551 [2024-11-17 11:03:07.104075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.104102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 #32 NEW cov: 12518 ft: 13909 corp: 9/261b lim: 100 exec/s: 0 rss: 73Mb L: 37/37 MS: 1 InsertByte- 00:08:42.551 [2024-11-17 11:03:07.164569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.164596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 [2024-11-17 11:03:07.164641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.164656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.551 [2024-11-17 11:03:07.164710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.164727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.551 #37 NEW cov: 12518 ft: 14795 corp: 10/328b lim: 100 exec/s: 0 rss: 73Mb L: 67/67 MS: 5 ChangeBit-CopyPart-CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:42.551 [2024-11-17 11:03:07.204847] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.204875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.551 [2024-11-17 11:03:07.204925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.204942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.551 [2024-11-17 11:03:07.204997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.205013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.551 [2024-11-17 11:03:07.205071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.551 [2024-11-17 11:03:07.205087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.813 #38 NEW cov: 12518 ft: 15184 corp: 11/416b lim: 100 exec/s: 0 rss: 73Mb L: 88/88 MS: 1 CopyPart- 00:08:42.813 [2024-11-17 11:03:07.264980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.265008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.813 [2024-11-17 11:03:07.265059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882298234068234 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.265073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.813 [2024-11-17 11:03:07.265127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.265145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.813 [2024-11-17 11:03:07.265200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.265216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.813 #39 NEW cov: 12518 ft: 15204 corp: 12/497b lim: 100 exec/s: 0 rss: 73Mb L: 81/88 MS: 1 InsertRepeatedBytes- 00:08:42.813 [2024-11-17 11:03:07.324719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.324748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
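Interleaved with the SPDK qpair notices are standard libFuzzer status lines; taking the "#38 NEW" line just above as a worked example (field meanings per stock libFuzzer output, annotated here for readers of this log):

    #38 NEW cov: 12518 ft: 15184 corp: 11/416b lim: 100 exec/s: 0 rss: 73Mb L: 88/88 MS: 1 CopyPart-
    #  NEW     = this input hit new coverage and was added to the corpus
    #  cov/ft  = coverage points / distinct features observed so far
    #  corp    = corpus now holds 11 inputs totalling 416 bytes
    #  lim     = current cap on mutated input size (100 bytes)
    #  exec/s  = executions per second (0 here just means less than a second had elapsed)
    #  rss     = fuzzer resident memory
    #  L       = size of this input / largest input in the corpus
    #  MS      = mutation sequence that produced it: 1 operation, CopyPart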
00:08:42.813 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:42.813 #40 NEW cov: 12541 ft: 15289 corp: 13/534b lim: 100 exec/s: 0 rss: 73Mb L: 37/88 MS: 1 CMP- DE: "\377\377\377\221"- 00:08:42.813 [2024-11-17 11:03:07.384888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.384917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.813 #41 NEW cov: 12541 ft: 15305 corp: 14/570b lim: 100 exec/s: 0 rss: 73Mb L: 36/88 MS: 1 ChangeByte- 00:08:42.813 [2024-11-17 11:03:07.425444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.425472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.813 [2024-11-17 11:03:07.425519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:757028606160404317 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.425536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.813 [2024-11-17 11:03:07.425588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.425605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.813 [2024-11-17 11:03:07.425657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.813 [2024-11-17 11:03:07.425672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.813 #42 NEW cov: 12541 ft: 15321 corp: 15/652b lim: 100 exec/s: 42 rss: 73Mb L: 82/88 MS: 1 InsertByte- 00:08:43.074 [2024-11-17 11:03:07.485194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.074 [2024-11-17 11:03:07.485224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.075 #43 NEW cov: 12541 ft: 15339 corp: 16/674b lim: 100 exec/s: 43 rss: 73Mb L: 22/88 MS: 1 ChangeByte- 00:08:43.075 [2024-11-17 11:03:07.545335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.545364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.075 #44 NEW cov: 12541 ft: 15366 corp: 17/695b lim: 100 exec/s: 44 rss: 73Mb L: 21/88 MS: 1 ChangeByte- 00:08:43.075 [2024-11-17 11:03:07.585439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 
[2024-11-17 11:03:07.585471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.075 #45 NEW cov: 12541 ft: 15443 corp: 18/732b lim: 100 exec/s: 45 rss: 73Mb L: 37/88 MS: 1 ShuffleBytes- 00:08:43.075 [2024-11-17 11:03:07.625996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.626024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.075 [2024-11-17 11:03:07.626079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18404074677430386687 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.626095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.075 [2024-11-17 11:03:07.626150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.626166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.075 [2024-11-17 11:03:07.626219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.626233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.075 #46 NEW cov: 12541 ft: 15488 corp: 19/826b lim: 100 exec/s: 46 rss: 73Mb L: 94/94 MS: 1 InsertRepeatedBytes- 00:08:43.075 [2024-11-17 11:03:07.686226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.686254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.075 [2024-11-17 11:03:07.686301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.686319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.075 [2024-11-17 11:03:07.686373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.686390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.075 [2024-11-17 11:03:07.686444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.686461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.075 #47 NEW cov: 12541 ft: 15497 corp: 20/907b lim: 100 exec/s: 47 rss: 74Mb L: 81/94 MS: 1 InsertRepeatedBytes- 00:08:43.075 [2024-11-17 
11:03:07.725858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182790911 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.075 [2024-11-17 11:03:07.725887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.336 #48 NEW cov: 12541 ft: 15507 corp: 21/945b lim: 100 exec/s: 48 rss: 74Mb L: 38/94 MS: 1 InsertByte- 00:08:43.336 [2024-11-17 11:03:07.786027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.786060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.336 #49 NEW cov: 12541 ft: 15536 corp: 22/967b lim: 100 exec/s: 49 rss: 74Mb L: 22/94 MS: 1 PersAutoDict- DE: "\377\377\377\221"- 00:08:43.336 [2024-11-17 11:03:07.846624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.846652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.336 [2024-11-17 11:03:07.846699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:792633534417165823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.846714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.336 [2024-11-17 11:03:07.846785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.846802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.336 [2024-11-17 11:03:07.846857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.846873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.336 #55 NEW cov: 12541 ft: 15539 corp: 23/1051b lim: 100 exec/s: 55 rss: 74Mb L: 84/94 MS: 1 CrossOver- 00:08:43.336 [2024-11-17 11:03:07.906347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.906376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.336 #56 NEW cov: 12541 ft: 15553 corp: 24/1087b lim: 100 exec/s: 56 rss: 74Mb L: 36/94 MS: 1 ChangeBinInt- 00:08:43.336 [2024-11-17 11:03:07.946452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.946481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.336 #57 NEW cov: 12541 ft: 15561 corp: 25/1124b lim: 100 exec/s: 57 rss: 74Mb L: 37/94 MS: 
1 InsertByte- 00:08:43.336 [2024-11-17 11:03:07.986886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.986915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.336 [2024-11-17 11:03:07.986952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4702111234474983745 len:16706 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.986968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.336 [2024-11-17 11:03:07.987023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4702111234474983745 len:65374 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.336 [2024-11-17 11:03:07.987039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.597 #58 NEW cov: 12541 ft: 15604 corp: 26/1189b lim: 100 exec/s: 58 rss: 74Mb L: 65/94 MS: 1 InsertRepeatedBytes- 00:08:43.597 [2024-11-17 11:03:08.026717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070549471231 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.597 [2024-11-17 11:03:08.026744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.597 [2024-11-17 11:03:08.066788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070549471231 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.597 [2024-11-17 11:03:08.066816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.597 #60 NEW cov: 12541 ft: 15612 corp: 27/1227b lim: 100 exec/s: 60 rss: 74Mb L: 38/94 MS: 2 InsertByte-ChangeByte- 00:08:43.597 [2024-11-17 11:03:08.107236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.597 [2024-11-17 11:03:08.107264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.597 [2024-11-17 11:03:08.107306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4702111234474983745 len:16706 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.597 [2024-11-17 11:03:08.107322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.597 [2024-11-17 11:03:08.107376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4702111234474983745 len:65374 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.597 [2024-11-17 11:03:08.107408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.597 #61 NEW cov: 12541 ft: 15625 corp: 28/1292b lim: 100 exec/s: 61 rss: 74Mb L: 65/94 MS: 1 ChangeByte- 00:08:43.598 [2024-11-17 11:03:08.167167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742976966164479 len:256 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:43.598 [2024-11-17 11:03:08.167195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.598 #62 NEW cov: 12541 ft: 15638 corp: 29/1329b lim: 100 exec/s: 62 rss: 74Mb L: 37/94 MS: 1 ChangeBit- 00:08:43.598 [2024-11-17 11:03:08.207247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.598 [2024-11-17 11:03:08.207275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.598 #63 NEW cov: 12541 ft: 15649 corp: 30/1365b lim: 100 exec/s: 63 rss: 74Mb L: 36/94 MS: 1 ShuffleBytes- 00:08:43.858 [2024-11-17 11:03:08.267551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.267580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.859 [2024-11-17 11:03:08.267633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.267651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.859 #64 NEW cov: 12541 ft: 15951 corp: 31/1413b lim: 100 exec/s: 64 rss: 74Mb L: 48/94 MS: 1 CopyPart- 00:08:43.859 [2024-11-17 11:03:08.327862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7523377975159973992 len:26473 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.327889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.859 [2024-11-17 11:03:08.327925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.327941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.859 [2024-11-17 11:03:08.327999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7523377975159973992 len:26729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.328015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.859 #65 NEW cov: 12541 ft: 15968 corp: 32/1480b lim: 100 exec/s: 65 rss: 74Mb L: 67/94 MS: 1 ChangeBinInt- 00:08:43.859 [2024-11-17 11:03:08.368134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744072182824959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.368161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.859 [2024-11-17 11:03:08.368208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.368224] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.859 [2024-11-17 11:03:08.368279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12514940413128060333 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.368295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.859 [2024-11-17 11:03:08.368352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.368368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.859 #66 NEW cov: 12541 ft: 15971 corp: 33/1561b lim: 100 exec/s: 66 rss: 74Mb L: 81/94 MS: 1 PersAutoDict- DE: "\377\377\377\221"- 00:08:43.859 [2024-11-17 11:03:08.427842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.859 [2024-11-17 11:03:08.427870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.859 #67 NEW cov: 12541 ft: 15978 corp: 34/1582b lim: 100 exec/s: 33 rss: 74Mb L: 21/94 MS: 1 CopyPart- 00:08:43.859 #67 DONE cov: 12541 ft: 15978 corp: 34/1582b lim: 100 exec/s: 33 rss: 74Mb 00:08:43.859 ###### Recommended dictionary. ###### 00:08:43.859 "\000\000\000\000" # Uses: 0 00:08:43.859 "\377\377\377\221" # Uses: 2 00:08:43.859 ###### End of recommended dictionary. ###### 00:08:43.859 Done 67 runs in 2 second(s) 00:08:44.120 11:03:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:44.120 11:03:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:44.120 11:03:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.120 11:03:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:44.120 00:08:44.120 real 1m2.951s 00:08:44.120 user 1m39.341s 00:08:44.120 sys 0m7.495s 00:08:44.120 11:03:08 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:44.120 11:03:08 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:44.120 ************************************ 00:08:44.120 END TEST nvmf_llvm_fuzz 00:08:44.120 ************************************ 00:08:44.120 11:03:08 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:44.120 11:03:08 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:44.120 11:03:08 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:44.120 11:03:08 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:44.120 11:03:08 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:44.120 11:03:08 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:44.120 ************************************ 00:08:44.120 START TEST vfio_llvm_fuzz 00:08:44.120 ************************************ 00:08:44.120 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:44.120 * Looking for test 
storage... 00:08:44.120 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.120 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:44.120 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:44.120 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:44.385 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.385 --rc genhtml_branch_coverage=1 00:08:44.385 --rc genhtml_function_coverage=1 00:08:44.385 --rc genhtml_legend=1 00:08:44.385 --rc geninfo_all_blocks=1 00:08:44.385 --rc geninfo_unexecuted_blocks=1 00:08:44.385 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.385 ' 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:44.385 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.385 --rc genhtml_branch_coverage=1 00:08:44.385 --rc genhtml_function_coverage=1 00:08:44.385 --rc genhtml_legend=1 00:08:44.385 --rc geninfo_all_blocks=1 00:08:44.385 --rc geninfo_unexecuted_blocks=1 00:08:44.385 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.385 ' 00:08:44.385 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:44.385 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.385 --rc genhtml_branch_coverage=1 00:08:44.385 --rc genhtml_function_coverage=1 00:08:44.385 --rc genhtml_legend=1 00:08:44.385 --rc geninfo_all_blocks=1 00:08:44.385 --rc geninfo_unexecuted_blocks=1 00:08:44.385 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.386 ' 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:44.386 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.386 --rc genhtml_branch_coverage=1 00:08:44.386 --rc genhtml_function_coverage=1 00:08:44.386 --rc genhtml_legend=1 00:08:44.386 --rc geninfo_all_blocks=1 00:08:44.386 --rc geninfo_unexecuted_blocks=1 00:08:44.386 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.386 ' 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:44.386 11:03:08 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:44.386 11:03:08 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:44.386 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:44.387 #define SPDK_CONFIG_H 00:08:44.387 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:44.387 #define SPDK_CONFIG_APPS 1 00:08:44.387 #define SPDK_CONFIG_ARCH native 00:08:44.387 #undef SPDK_CONFIG_ASAN 00:08:44.387 #undef SPDK_CONFIG_AVAHI 00:08:44.387 #undef SPDK_CONFIG_CET 00:08:44.387 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:44.387 #define SPDK_CONFIG_COVERAGE 1 00:08:44.387 #define SPDK_CONFIG_CROSS_PREFIX 00:08:44.387 #undef SPDK_CONFIG_CRYPTO 00:08:44.387 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:44.387 #undef SPDK_CONFIG_CUSTOMOCF 00:08:44.387 #undef SPDK_CONFIG_DAOS 00:08:44.387 #define SPDK_CONFIG_DAOS_DIR 00:08:44.387 #define SPDK_CONFIG_DEBUG 1 00:08:44.387 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:44.387 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:44.387 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:44.387 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:44.387 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:44.387 #undef SPDK_CONFIG_DPDK_UADK 00:08:44.387 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:44.387 #define SPDK_CONFIG_EXAMPLES 1 00:08:44.387 #undef SPDK_CONFIG_FC 00:08:44.387 #define SPDK_CONFIG_FC_PATH 00:08:44.387 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:44.387 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:44.387 #define SPDK_CONFIG_FSDEV 1 00:08:44.387 #undef 
SPDK_CONFIG_FUSE 00:08:44.387 #define SPDK_CONFIG_FUZZER 1 00:08:44.387 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:44.387 #undef SPDK_CONFIG_GOLANG 00:08:44.387 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:44.387 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:44.387 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:44.387 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:44.387 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:44.387 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:44.387 #undef SPDK_CONFIG_HAVE_LZ4 00:08:44.387 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:44.387 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:44.387 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:44.387 #define SPDK_CONFIG_IDXD 1 00:08:44.387 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:44.387 #undef SPDK_CONFIG_IPSEC_MB 00:08:44.387 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:44.387 #define SPDK_CONFIG_ISAL 1 00:08:44.387 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:44.387 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:44.387 #define SPDK_CONFIG_LIBDIR 00:08:44.387 #undef SPDK_CONFIG_LTO 00:08:44.387 #define SPDK_CONFIG_MAX_LCORES 128 00:08:44.387 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:44.387 #define SPDK_CONFIG_NVME_CUSE 1 00:08:44.387 #undef SPDK_CONFIG_OCF 00:08:44.387 #define SPDK_CONFIG_OCF_PATH 00:08:44.387 #define SPDK_CONFIG_OPENSSL_PATH 00:08:44.387 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:44.387 #define SPDK_CONFIG_PGO_DIR 00:08:44.387 #undef SPDK_CONFIG_PGO_USE 00:08:44.387 #define SPDK_CONFIG_PREFIX /usr/local 00:08:44.387 #undef SPDK_CONFIG_RAID5F 00:08:44.387 #undef SPDK_CONFIG_RBD 00:08:44.387 #define SPDK_CONFIG_RDMA 1 00:08:44.387 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:44.387 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:44.387 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:44.387 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:44.387 #undef SPDK_CONFIG_SHARED 00:08:44.387 #undef SPDK_CONFIG_SMA 00:08:44.387 #define SPDK_CONFIG_TESTS 1 00:08:44.387 #undef SPDK_CONFIG_TSAN 00:08:44.387 #define SPDK_CONFIG_UBLK 1 00:08:44.387 #define SPDK_CONFIG_UBSAN 1 00:08:44.387 #undef SPDK_CONFIG_UNIT_TESTS 00:08:44.387 #undef SPDK_CONFIG_URING 00:08:44.387 #define SPDK_CONFIG_URING_PATH 00:08:44.387 #undef SPDK_CONFIG_URING_ZNS 00:08:44.387 #undef SPDK_CONFIG_USDT 00:08:44.387 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:44.387 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:44.387 #define SPDK_CONFIG_VFIO_USER 1 00:08:44.387 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:44.387 #define SPDK_CONFIG_VHOST 1 00:08:44.387 #define SPDK_CONFIG_VIRTIO 1 00:08:44.387 #undef SPDK_CONFIG_VTUNE 00:08:44.387 #define SPDK_CONFIG_VTUNE_DIR 00:08:44.387 #define SPDK_CONFIG_WERROR 1 00:08:44.387 #define SPDK_CONFIG_WPDK_DIR 00:08:44.387 #undef SPDK_CONFIG_XNVME 00:08:44.387 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:44.387 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:44.388 11:03:08 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:44.388 
11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:44.388 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 159556 ]] 00:08:44.389 
11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 159556 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:44.389 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.Jow6s7 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.Jow6s7/tests/vfio /tmp/spdk.Jow6s7 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=53078478848 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730553856 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=8652075008 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:08 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861848576 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865276928 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340117504 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346114048 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5996544 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30865100800 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865276928 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=176128 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173040640 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173052928 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:44.390 * Looking for test storage... 
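The entries above trace set_test_storage as it reads `df -T` output into parallel associative arrays (mounts, fss, sizes, avails, uses) keyed by mount point; the candidate-directory selection continues below. A minimal standalone sketch of the same idiom, assuming GNU coreutils df — the probe_storage wrapper and the /tmp and /var/tmp candidates are illustrative, not part of the SPDK scripts:

# Standalone sketch of the storage probe traced in this run. Variable
# names follow the log; probe_storage itself and the candidate list
# below are illustrative, not the actual SPDK helper.
probe_storage() {
    local requested_size=2147483648            # 2 GiB, as requested in the trace
    local -A mounts fss sizes avails uses
    local source fs size use avail _ mount
    # df -T columns: Filesystem Type 1K-blocks Used Available Use% Mounted-on
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))       # 1K blocks -> bytes
        avails["$mount"]=$((avail * 1024))
        uses["$mount"]=$((use * 1024))
    done < <(df -T | grep -v Filesystem)
    local dir target_space
    for dir in "$@"; do
        # Map the directory to its mount point, as the awk call below does.
        mount=$(df "$dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails["$mount"]}
        if (( target_space >= requested_size )); then
            printf '* Found test storage at %s\n' "$dir"
            return 0
        fi
    done
    return 1
}
probe_storage /tmp /var/tmp   # hypothetical candidate directories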
00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=53078478848 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=10866667520 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.390 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:44.390 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:44.391 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:44.652 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.652 --rc genhtml_branch_coverage=1 00:08:44.652 --rc genhtml_function_coverage=1 00:08:44.652 --rc genhtml_legend=1 00:08:44.652 --rc geninfo_all_blocks=1 00:08:44.652 --rc geninfo_unexecuted_blocks=1 00:08:44.652 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.652 ' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:44.652 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.652 --rc genhtml_branch_coverage=1 00:08:44.652 --rc genhtml_function_coverage=1 00:08:44.652 --rc genhtml_legend=1 00:08:44.652 --rc geninfo_all_blocks=1 00:08:44.652 --rc geninfo_unexecuted_blocks=1 00:08:44.652 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.652 ' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:44.652 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.652 --rc genhtml_branch_coverage=1 00:08:44.652 --rc genhtml_function_coverage=1 00:08:44.652 --rc genhtml_legend=1 00:08:44.652 --rc geninfo_all_blocks=1 00:08:44.652 --rc geninfo_unexecuted_blocks=1 00:08:44.652 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.652 ' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:44.652 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:44.652 --rc genhtml_branch_coverage=1 00:08:44.652 --rc genhtml_function_coverage=1 00:08:44.652 --rc genhtml_legend=1 00:08:44.652 --rc geninfo_all_blocks=1 00:08:44.652 --rc geninfo_unexecuted_blocks=1 00:08:44.652 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:44.652 ' 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:44.652 11:03:09 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:44.652 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:44.652 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.653 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:44.653 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:44.653 11:03:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:44.653 [2024-11-17 11:03:09.174298] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:44.653 [2024-11-17 11:03:09.174376] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid159699 ] 00:08:44.653 [2024-11-17 11:03:09.269530] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.653 [2024-11-17 11:03:09.295727] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.913 INFO: Running with entropic power schedule (0xFF, 100). 00:08:44.913 INFO: Seed: 989805908 00:08:44.913 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:08:44.913 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:08:44.913 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:44.913 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.913 #2 INITED exec/s: 0 rss: 65Mb 00:08:44.913 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:44.913 This may also happen if the target rejected all inputs we tried so far 00:08:44.913 [2024-11-17 11:03:09.542350] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:45.433 NEW_FUNC[1/671]: 0x4521a8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:45.433 NEW_FUNC[2/671]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:45.433 #15 NEW cov: 11154 ft: 11119 corp: 2/7b lim: 6 exec/s: 0 rss: 71Mb L: 6/6 MS: 3 CMP-CopyPart-CrossOver- DE: "\377\377\377\036"- 00:08:45.433 NEW_FUNC[1/1]: 0x47c628 in bdev_malloc_check_iov_len /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:258 00:08:45.433 #21 NEW cov: 11177 ft: 14315 corp: 3/13b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:45.693 #27 NEW cov: 11177 ft: 14586 corp: 4/19b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ChangeByte- 00:08:45.954 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:45.954 #28 NEW cov: 11197 ft: 15900 corp: 5/25b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 CopyPart- 00:08:45.954 #31 NEW cov: 11197 ft: 17023 corp: 6/31b lim: 6 exec/s: 31 rss: 73Mb L: 6/6 MS: 3 EraseBytes-CopyPart-InsertByte- 00:08:46.214 #32 NEW cov: 11197 ft: 17316 corp: 7/37b lim: 6 exec/s: 32 rss: 73Mb L: 6/6 MS: 1 CopyPart- 00:08:46.474 #38 NEW cov: 11197 ft: 17624 corp: 8/43b lim: 6 exec/s: 38 rss: 73Mb L: 6/6 MS: 1 CopyPart- 00:08:46.735 #39 NEW cov: 11197 ft: 18040 corp: 9/49b lim: 6 exec/s: 39 rss: 73Mb L: 6/6 MS: 1 CopyPart- 00:08:46.735 #40 NEW cov: 11204 ft: 18362 corp: 10/55b lim: 6 exec/s: 40 rss: 73Mb L: 6/6 MS: 1 CopyPart- 00:08:46.995 #41 NEW cov: 11204 ft: 18483 corp: 11/61b lim: 6 exec/s: 20 rss: 73Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:46.995 #41 DONE cov: 11204 ft: 18483 corp: 11/61b lim: 6 exec/s: 20 rss: 73Mb 
00:08:46.995 ###### Recommended dictionary. ###### 00:08:46.995 "\377\377\377\036" # Uses: 0 00:08:46.995 ###### End of recommended dictionary. ###### 00:08:46.995 Done 41 runs in 2 second(s) 00:08:46.995 [2024-11-17 11:03:11.601230] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:47.256 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:47.256 11:03:11 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:47.256 [2024-11-17 11:03:11.861141] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:47.256 [2024-11-17 11:03:11.861226] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid160152 ] 00:08:47.517 [2024-11-17 11:03:11.956451] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.517 [2024-11-17 11:03:11.978659] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.517 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.517 INFO: Seed: 3665803666 00:08:47.777 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:08:47.777 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:08:47.777 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:47.777 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.777 #2 INITED exec/s: 0 rss: 66Mb 00:08:47.777 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.777 This may also happen if the target rejected all inputs we tried so far 00:08:47.777 [2024-11-17 11:03:12.221057] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:47.777 [2024-11-17 11:03:12.286088] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:47.777 [2024-11-17 11:03:12.286124] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:47.777 [2024-11-17 11:03:12.286143] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:48.038 NEW_FUNC[1/673]: 0x452748 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:48.038 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:48.038 #7 NEW cov: 11162 ft: 10734 corp: 2/5b lim: 4 exec/s: 0 rss: 71Mb L: 4/4 MS: 5 CrossOver-ChangeBit-CopyPart-ChangeBinInt-InsertByte- 00:08:48.299 [2024-11-17 11:03:12.757403] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:48.299 [2024-11-17 11:03:12.757437] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:48.299 [2024-11-17 11:03:12.757455] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:48.299 NEW_FUNC[1/1]: 0x1f61ce8 in spdk_get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1282 00:08:48.299 #8 NEW cov: 11177 ft: 13943 corp: 3/9b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:48.299 [2024-11-17 11:03:12.945001] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:48.299 [2024-11-17 11:03:12.945024] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:48.299 [2024-11-17 11:03:12.945043] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:48.567 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:48.567 #13 NEW cov: 11194 ft: 15265 corp: 4/13b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 5 ChangeByte-CopyPart-ChangeBit-InsertByte-CrossOver- 00:08:48.567 [2024-11-17 11:03:13.132926] vfio_user.c:3110:vfio_user_log: *ERROR*: 
/tmp/vfio-user-1/domain/1: bad command 1 00:08:48.567 [2024-11-17 11:03:13.132950] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:48.567 [2024-11-17 11:03:13.132967] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:48.827 #14 NEW cov: 11195 ft: 16314 corp: 5/17b lim: 4 exec/s: 14 rss: 73Mb L: 4/4 MS: 1 ChangeBit- 00:08:48.827 [2024-11-17 11:03:13.306319] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:48.827 [2024-11-17 11:03:13.306343] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:48.827 [2024-11-17 11:03:13.306360] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:48.828 #15 NEW cov: 11195 ft: 16457 corp: 6/21b lim: 4 exec/s: 15 rss: 73Mb L: 4/4 MS: 1 ChangeByte- 00:08:49.088 [2024-11-17 11:03:13.486941] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.088 [2024-11-17 11:03:13.486964] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.088 [2024-11-17 11:03:13.486981] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.088 #17 NEW cov: 11197 ft: 16710 corp: 7/25b lim: 4 exec/s: 17 rss: 73Mb L: 4/4 MS: 2 CopyPart-CrossOver- 00:08:49.088 [2024-11-17 11:03:13.662323] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.088 [2024-11-17 11:03:13.662347] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.088 [2024-11-17 11:03:13.662365] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.349 #22 NEW cov: 11197 ft: 16973 corp: 8/29b lim: 4 exec/s: 22 rss: 73Mb L: 4/4 MS: 5 ChangeByte-CopyPart-ChangeBit-ChangeBit-CrossOver- 00:08:49.349 [2024-11-17 11:03:13.839838] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.349 [2024-11-17 11:03:13.839863] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.349 [2024-11-17 11:03:13.839880] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.349 #28 NEW cov: 11197 ft: 17081 corp: 9/33b lim: 4 exec/s: 28 rss: 73Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:49.609 [2024-11-17 11:03:14.014109] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.609 [2024-11-17 11:03:14.014133] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.609 [2024-11-17 11:03:14.014150] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.609 #29 NEW cov: 11204 ft: 17366 corp: 10/37b lim: 4 exec/s: 29 rss: 73Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:49.609 [2024-11-17 11:03:14.211484] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.609 [2024-11-17 11:03:14.211507] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.609 [2024-11-17 11:03:14.211525] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.869 #37 NEW cov: 11204 ft: 18127 corp: 11/41b lim: 4 exec/s: 18 rss: 73Mb L: 4/4 MS: 3 EraseBytes-CopyPart-InsertByte- 00:08:49.869 #37 DONE cov: 11204 ft: 18127 corp: 11/41b lim: 4 exec/s: 18 rss: 73Mb 00:08:49.869 Done 37 runs in 2 second(s) 00:08:49.869 [2024-11-17 
11:03:14.336223] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:50.130 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:50.130 11:03:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:50.130 [2024-11-17 11:03:14.594943] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:50.130 [2024-11-17 11:03:14.595010] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid160688 ] 00:08:50.130 [2024-11-17 11:03:14.690759] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.130 [2024-11-17 11:03:14.713767] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.391 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:50.391 INFO: Seed: 2103825180 00:08:50.391 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:08:50.391 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:08:50.391 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:50.391 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.391 #2 INITED exec/s: 0 rss: 65Mb 00:08:50.391 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.391 This may also happen if the target rejected all inputs we tried so far 00:08:50.391 [2024-11-17 11:03:14.947779] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:50.391 [2024-11-17 11:03:15.015303] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:50.912 NEW_FUNC[1/673]: 0x453138 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:50.912 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.912 #59 NEW cov: 11149 ft: 11116 corp: 2/9b lim: 8 exec/s: 0 rss: 70Mb L: 8/8 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:50.912 [2024-11-17 11:03:15.501187] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:51.172 #70 NEW cov: 11163 ft: 14501 corp: 3/17b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:51.172 [2024-11-17 11:03:15.686281] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:51.172 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:51.172 #71 NEW cov: 11180 ft: 15292 corp: 4/25b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:51.433 [2024-11-17 11:03:15.868523] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:51.433 #77 NEW cov: 11180 ft: 15580 corp: 5/33b lim: 8 exec/s: 77 rss: 73Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:51.433 [2024-11-17 11:03:16.051932] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:51.693 #78 NEW cov: 11180 ft: 15818 corp: 6/41b lim: 8 exec/s: 78 rss: 73Mb L: 8/8 MS: 1 CopyPart- 00:08:51.693 [2024-11-17 11:03:16.231478] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:51.693 #79 NEW cov: 11180 ft: 15969 corp: 7/49b lim: 8 exec/s: 79 rss: 73Mb L: 8/8 MS: 1 ChangeBit- 00:08:51.954 [2024-11-17 11:03:16.414097] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:51.954 #80 NEW cov: 11180 ft: 16335 corp: 8/57b lim: 8 exec/s: 80 rss: 73Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:51.954 [2024-11-17 11:03:16.598892] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.214 #81 NEW cov: 11180 ft: 16803 corp: 9/65b lim: 8 exec/s: 81 rss: 73Mb L: 8/8 MS: 1 ChangeBit- 00:08:52.214 [2024-11-17 11:03:16.779053] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.475 #87 NEW cov: 11187 ft: 17201 corp: 10/73b lim: 8 exec/s: 87 rss: 73Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:52.475 [2024-11-17 11:03:16.964030] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.475 #89 
NEW cov: 11187 ft: 17434 corp: 11/81b lim: 8 exec/s: 44 rss: 73Mb L: 8/8 MS: 2 EraseBytes-InsertByte- 00:08:52.475 #89 DONE cov: 11187 ft: 17434 corp: 11/81b lim: 8 exec/s: 44 rss: 73Mb 00:08:52.475 Done 89 runs in 2 second(s) 00:08:52.475 [2024-11-17 11:03:17.084241] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:52.734 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:52.734 11:03:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:52.734 [2024-11-17 11:03:17.345731] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:52.734 [2024-11-17 11:03:17.345826] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161233 ] 00:08:52.995 [2024-11-17 11:03:17.440435] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.995 [2024-11-17 11:03:17.462177] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.995 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.995 INFO: Seed: 554854426 00:08:53.255 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:08:53.255 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:08:53.255 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:53.255 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.255 #2 INITED exec/s: 0 rss: 65Mb 00:08:53.255 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.255 This may also happen if the target rejected all inputs we tried so far 00:08:53.255 [2024-11-17 11:03:17.693663] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:53.516 NEW_FUNC[1/673]: 0x453828 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:53.516 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:53.516 #44 NEW cov: 11156 ft: 11117 corp: 2/33b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:53.776 #50 NEW cov: 11171 ft: 14610 corp: 3/65b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:08:54.036 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:54.036 #56 NEW cov: 11188 ft: 15789 corp: 4/97b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:08:54.036 [2024-11-17 11:03:18.600493] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 9187201950435737471 > max 8796093022208 00:08:54.036 [2024-11-17 11:03:18.600532] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x7f7f7f7f7f7f7fac, 0xfefefefefefeff2b) offset=0x7f7f7f797979797f flags=0x3: No space left on device 00:08:54.036 [2024-11-17 11:03:18.600544] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:08:54.036 [2024-11-17 11:03:18.600560] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.296 NEW_FUNC[1/1]: 0x159a848 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3098 00:08:54.296 #62 NEW cov: 11199 ft: 16212 corp: 5/129b lim: 32 exec/s: 62 rss: 73Mb L: 32/32 MS: 1 CMP- DE: "\000\000\000\000\004x\334\254"- 00:08:54.297 [2024-11-17 11:03:18.793416] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 9187201950435737471 > max 8796093022208 00:08:54.297 [2024-11-17 11:03:18.793443] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x7f7f7f7f7f7f7fac, 0xfefefefefefeff2b) offset=0x7f7f7f79792e797f flags=0x3: No space left on device 00:08:54.297 [2024-11-17 11:03:18.793455] 
vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:08:54.297 [2024-11-17 11:03:18.793471] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.297 #63 NEW cov: 11199 ft: 16862 corp: 6/161b lim: 32 exec/s: 63 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:08:54.557 #64 NEW cov: 11199 ft: 17039 corp: 7/193b lim: 32 exec/s: 64 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:54.818 #65 NEW cov: 11199 ft: 17450 corp: 8/225b lim: 32 exec/s: 65 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:54.818 [2024-11-17 11:03:19.312019] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 9187097496831098751 > max 8796093022208 00:08:54.818 [2024-11-17 11:03:19.312056] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x7f7f7f7f7f7f7fac, 0xfefe9ffefefeff2b) offset=0x7f7f7f797979797f flags=0x3: No space left on device 00:08:54.818 [2024-11-17 11:03:19.312068] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:08:54.818 [2024-11-17 11:03:19.312085] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.818 #66 NEW cov: 11199 ft: 17818 corp: 9/257b lim: 32 exec/s: 66 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:55.078 #67 NEW cov: 11206 ft: 17828 corp: 10/289b lim: 32 exec/s: 67 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:55.078 [2024-11-17 11:03:19.653892] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 9187201950435737471 > max 8796093022208 00:08:55.078 [2024-11-17 11:03:19.653917] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x7f7f05007f7f7fac, 0xfefe847ffefeff2b) offset=0x7f7f7f797979797f flags=0x3: No space left on device 00:08:55.078 [2024-11-17 11:03:19.653930] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:08:55.078 [2024-11-17 11:03:19.653946] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:55.338 #68 NEW cov: 11206 ft: 18395 corp: 11/321b lim: 32 exec/s: 34 rss: 74Mb L: 32/32 MS: 1 CMP- DE: "\000\005"- 00:08:55.338 #68 DONE cov: 11206 ft: 18395 corp: 11/321b lim: 32 exec/s: 34 rss: 74Mb 00:08:55.338 ###### Recommended dictionary. ###### 00:08:55.338 "\000\000\000\000\004x\334\254" # Uses: 0 00:08:55.338 "\000\005" # Uses: 0 00:08:55.338 ###### End of recommended dictionary. 
###### 00:08:55.338 Done 68 runs in 2 second(s) 00:08:55.338 [2024-11-17 11:03:19.853239] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:55.599 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:55.599 11:03:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:55.599 [2024-11-17 11:03:20.117032] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:55.599 [2024-11-17 11:03:20.117139] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161647 ] 00:08:55.599 [2024-11-17 11:03:20.213352] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.599 [2024-11-17 11:03:20.236771] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.859 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.859 INFO: Seed: 3333863434 00:08:55.859 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:08:55.859 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:08:55.859 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:55.860 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.860 #2 INITED exec/s: 0 rss: 65Mb 00:08:55.860 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:55.860 This may also happen if the target rejected all inputs we tried so far 00:08:55.860 [2024-11-17 11:03:20.481245] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:56.380 NEW_FUNC[1/673]: 0x4540a8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:56.380 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.380 #11 NEW cov: 11156 ft: 11118 corp: 2/33b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 4 InsertRepeatedBytes-ShuffleBytes-CopyPart-InsertByte- 00:08:56.641 #17 NEW cov: 11173 ft: 14767 corp: 3/65b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 1 CopyPart- 00:08:56.901 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:56.901 #23 NEW cov: 11190 ft: 16366 corp: 4/97b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:56.901 #31 NEW cov: 11190 ft: 17013 corp: 5/129b lim: 32 exec/s: 31 rss: 72Mb L: 32/32 MS: 3 EraseBytes-InsertByte-InsertRepeatedBytes- 00:08:57.161 #32 NEW cov: 11190 ft: 17315 corp: 6/161b lim: 32 exec/s: 32 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:08:57.422 #43 NEW cov: 11190 ft: 17513 corp: 7/193b lim: 32 exec/s: 43 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:57.682 #44 NEW cov: 11190 ft: 17729 corp: 8/225b lim: 32 exec/s: 44 rss: 73Mb L: 32/32 MS: 1 ChangeBit- 00:08:57.682 #50 NEW cov: 11197 ft: 17855 corp: 9/257b lim: 32 exec/s: 50 rss: 73Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:57.942 #51 NEW cov: 11197 ft: 17927 corp: 10/289b lim: 32 exec/s: 25 rss: 73Mb L: 32/32 MS: 1 CMP- DE: "S#\000\000\000\000\000\000"- 00:08:57.942 #51 DONE cov: 11197 ft: 17927 corp: 10/289b lim: 32 exec/s: 25 rss: 73Mb 00:08:57.942 ###### Recommended dictionary. ###### 00:08:57.942 "S#\000\000\000\000\000\000" # Uses: 0 00:08:57.942 ###### End of recommended dictionary. 
###### 00:08:57.942 Done 51 runs in 2 second(s) 00:08:57.942 [2024-11-17 11:03:22.485226] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:58.204 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:58.204 11:03:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:58.204 [2024-11-17 11:03:22.748746] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:58.204 [2024-11-17 11:03:22.748819] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid162060 ] 00:08:58.204 [2024-11-17 11:03:22.821529] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.204 [2024-11-17 11:03:22.843458] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.464 INFO: Running with entropic power schedule (0xFF, 100). 00:08:58.464 INFO: Seed: 1641903075 00:08:58.464 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:08:58.464 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:08:58.464 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:58.464 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.464 #2 INITED exec/s: 0 rss: 67Mb 00:08:58.464 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.464 This may also happen if the target rejected all inputs we tried so far 00:08:58.464 [2024-11-17 11:03:23.079263] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:58.725 [2024-11-17 11:03:23.152722] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.725 [2024-11-17 11:03:23.152763] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.987 NEW_FUNC[1/674]: 0x454aa8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:58.987 NEW_FUNC[2/674]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:58.987 #64 NEW cov: 11165 ft: 11122 corp: 2/14b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:58.987 [2024-11-17 11:03:23.638246] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.987 [2024-11-17 11:03:23.638288] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.247 #70 NEW cov: 11179 ft: 13622 corp: 3/27b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 CrossOver- 00:08:59.247 [2024-11-17 11:03:23.835048] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.247 [2024-11-17 11:03:23.835085] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.508 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:59.508 #81 NEW cov: 11196 ft: 14627 corp: 4/40b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 CopyPart- 00:08:59.508 [2024-11-17 11:03:24.037134] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.508 [2024-11-17 11:03:24.037168] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.508 #82 NEW cov: 11199 ft: 15230 corp: 5/53b lim: 13 exec/s: 82 rss: 75Mb L: 13/13 MS: 1 CMP- DE: "\377\377\377\377\003G\033\006"- 00:08:59.769 [2024-11-17 11:03:24.232346] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.769 [2024-11-17 11:03:24.232377] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return 
failure 00:08:59.769 #83 NEW cov: 11199 ft: 15318 corp: 6/66b lim: 13 exec/s: 83 rss: 77Mb L: 13/13 MS: 1 ChangeBit- 00:08:59.769 [2024-11-17 11:03:24.419847] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.769 [2024-11-17 11:03:24.419876] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.030 #89 NEW cov: 11199 ft: 16800 corp: 7/79b lim: 13 exec/s: 89 rss: 77Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:00.030 [2024-11-17 11:03:24.615486] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.030 [2024-11-17 11:03:24.615517] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.290 #95 NEW cov: 11199 ft: 16883 corp: 8/92b lim: 13 exec/s: 95 rss: 77Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:00.290 [2024-11-17 11:03:24.794218] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.290 [2024-11-17 11:03:24.794249] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.290 #96 NEW cov: 11206 ft: 17276 corp: 9/105b lim: 13 exec/s: 96 rss: 77Mb L: 13/13 MS: 1 PersAutoDict- DE: "\377\377\377\377\003G\033\006"- 00:09:00.551 [2024-11-17 11:03:24.977802] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.551 [2024-11-17 11:03:24.977834] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.551 #97 NEW cov: 11206 ft: 17674 corp: 10/118b lim: 13 exec/s: 48 rss: 77Mb L: 13/13 MS: 1 ChangeByte- 00:09:00.551 #97 DONE cov: 11206 ft: 17674 corp: 10/118b lim: 13 exec/s: 48 rss: 77Mb 00:09:00.551 ###### Recommended dictionary. ###### 00:09:00.551 "\377\377\377\377\003G\033\006" # Uses: 2 00:09:00.551 ###### End of recommended dictionary. 
###### 00:09:00.551 Done 97 runs in 2 second(s) 00:09:00.551 [2024-11-17 11:03:25.106250] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:00.812 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:00.812 11:03:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:00.812 [2024-11-17 11:03:25.365410] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:09:00.812 [2024-11-17 11:03:25.365479] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid162591 ] 00:09:00.812 [2024-11-17 11:03:25.457123] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.073 [2024-11-17 11:03:25.479677] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.073 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.073 INFO: Seed: 4279908272 00:09:01.073 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:01.073 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:01.073 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:01.073 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.073 #2 INITED exec/s: 0 rss: 65Mb 00:09:01.073 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:01.073 This may also happen if the target rejected all inputs we tried so far 00:09:01.073 [2024-11-17 11:03:25.721958] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:01.333 [2024-11-17 11:03:25.785075] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:01.333 [2024-11-17 11:03:25.785109] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:01.594 NEW_FUNC[1/672]: 0x455798 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:01.594 NEW_FUNC[2/672]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:01.594 #17 NEW cov: 11148 ft: 10718 corp: 2/10b lim: 9 exec/s: 0 rss: 71Mb L: 9/9 MS: 5 ChangeByte-InsertRepeatedBytes-ChangeBinInt-ShuffleBytes-InsertByte- 00:09:01.855 [2024-11-17 11:03:26.273804] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:01.855 [2024-11-17 11:03:26.273845] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:01.855 NEW_FUNC[1/2]: 0x1f34ca8 in accel_comp_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel_sw.c:683 00:09:01.855 NEW_FUNC[2/2]: 0x1f61ce8 in spdk_get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1282 00:09:01.855 #27 NEW cov: 11170 ft: 13877 corp: 3/19b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 5 ChangeBit-ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:09:01.855 [2024-11-17 11:03:26.476152] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:01.855 [2024-11-17 11:03:26.476183] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:02.115 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:02.115 #28 NEW cov: 11187 ft: 15363 corp: 4/28b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:02.115 [2024-11-17 11:03:26.649508] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:02.115 [2024-11-17 11:03:26.649540] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:02.115 #29 NEW cov: 11187 ft: 
16457 corp: 5/37b lim: 9 exec/s: 29 rss: 73Mb L: 9/9 MS: 1 CopyPart-
00:09:02.376 [2024-11-17 11:03:26.827917] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:02.376 [2024-11-17 11:03:26.827948] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:02.376 #30 NEW cov: 11187 ft: 17005 corp: 6/46b lim: 9 exec/s: 30 rss: 73Mb L: 9/9 MS: 1 CrossOver-
00:09:02.376 [2024-11-17 11:03:27.010749] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:02.376 [2024-11-17 11:03:27.010779] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:02.636 #31 NEW cov: 11187 ft: 17238 corp: 7/55b lim: 9 exec/s: 31 rss: 73Mb L: 9/9 MS: 1 CrossOver-
00:09:02.636 [2024-11-17 11:03:27.188375] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:02.636 [2024-11-17 11:03:27.188407] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:02.896 #32 NEW cov: 11187 ft: 17436 corp: 8/64b lim: 9 exec/s: 32 rss: 73Mb L: 9/9 MS: 1 ChangeBit-
00:09:02.896 [2024-11-17 11:03:27.362379] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:02.897 [2024-11-17 11:03:27.362410] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:02.897 #33 NEW cov: 11187 ft: 17596 corp: 9/73b lim: 9 exec/s: 33 rss: 73Mb L: 9/9 MS: 1 ChangeBit-
00:09:02.897 [2024-11-17 11:03:27.532766] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:02.897 [2024-11-17 11:03:27.532796] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:03.157 #34 NEW cov: 11194 ft: 17650 corp: 10/82b lim: 9 exec/s: 34 rss: 74Mb L: 9/9 MS: 1 ChangeByte-
00:09:03.157 [2024-11-17 11:03:27.723344] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:03.157 [2024-11-17 11:03:27.723375] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:03.417 #40 NEW cov: 11194 ft: 17837 corp: 11/91b lim: 9 exec/s: 20 rss: 74Mb L: 9/9 MS: 1 CMP- DE: "n\000\000\000\000\000\000\000"-
00:09:03.417 #40 DONE cov: 11194 ft: 17837 corp: 11/91b lim: 9 exec/s: 20 rss: 74Mb
00:09:03.417 ###### Recommended dictionary. ######
00:09:03.417 "n\000\000\000\000\000\000\000" # Uses: 0
00:09:03.417 ###### End of recommended dictionary. ######
00:09:03.417 Done 40 runs in 2 second(s)
00:09:03.417 [2024-11-17 11:03:27.851235] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller
00:09:03.417 11:03:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz
00:09:03.417 11:03:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:09:03.417 11:03:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:03.418 11:03:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT
00:09:03.418
00:09:03.418 real 0m19.415s
00:09:03.418 user 0m27.479s
00:09:03.418 sys 0m1.927s
00:09:03.418 11:03:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable
00:09:03.418 11:03:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x
00:09:03.418 ************************************
00:09:03.418 END TEST vfio_llvm_fuzz
00:09:03.418 ************************************
00:09:03.678
00:09:03.678 real 1m22.743s
00:09:03.678 user 2m6.994s
00:09:03.678 sys 0m9.656s
00:09:03.678 11:03:28 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable
00:09:03.678 11:03:28 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x
00:09:03.678 ************************************
00:09:03.678 END TEST llvm_fuzz
00:09:03.678 ************************************
00:09:03.678 11:03:28 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]]
00:09:03.678 11:03:28 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT
00:09:03.678 11:03:28 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup
00:09:03.678 11:03:28 -- common/autotest_common.sh@726 -- # xtrace_disable
00:09:03.678 11:03:28 -- common/autotest_common.sh@10 -- # set +x
00:09:03.678 11:03:28 -- spdk/autotest.sh@388 -- # autotest_cleanup
00:09:03.678 11:03:28 -- common/autotest_common.sh@1396 -- # local autotest_es=0
00:09:03.678 11:03:28 -- common/autotest_common.sh@1397 -- # xtrace_disable
00:09:03.678 11:03:28 -- common/autotest_common.sh@10 -- # set +x
00:09:10.267 INFO: APP EXITING
00:09:10.267 INFO: killing all VMs
00:09:10.267 INFO: killing vhost app
00:09:10.267 INFO: EXIT DONE
00:09:13.569 Waiting for block devices as requested
00:09:13.569 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:13.569 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:13.830 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:13.830 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:13.830 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:14.091 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:14.091 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:14.091 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:14.351 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:14.351 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:14.351 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:14.612 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:14.612 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:14.612 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:14.873 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:14.873 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:14.873 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:19.083 Cleaning
00:09:19.083 Removing: /dev/shm/spdk_tgt_trace.pid134890
00:09:19.083 Removing: /var/run/dpdk/spdk_pid132423
00:09:19.083 Removing: /var/run/dpdk/spdk_pid133597
00:09:19.083 Removing: /var/run/dpdk/spdk_pid134890
00:09:19.083 Removing: /var/run/dpdk/spdk_pid135370
00:09:19.083 Removing: /var/run/dpdk/spdk_pid136457
00:09:19.083 Removing: /var/run/dpdk/spdk_pid136495
00:09:19.083 Removing: /var/run/dpdk/spdk_pid137586
00:09:19.083 Removing: /var/run/dpdk/spdk_pid137592
00:09:19.083 Removing: /var/run/dpdk/spdk_pid138033
00:09:19.083 Removing: /var/run/dpdk/spdk_pid138356
00:09:19.083 Removing: /var/run/dpdk/spdk_pid138685
00:09:19.083 Removing: /var/run/dpdk/spdk_pid139021
00:09:19.083 Removing: /var/run/dpdk/spdk_pid139194
00:09:19.083 Removing: /var/run/dpdk/spdk_pid139387
00:09:19.083 Removing: /var/run/dpdk/spdk_pid139667
00:09:19.083 Removing: /var/run/dpdk/spdk_pid139989
00:09:19.083 Removing: /var/run/dpdk/spdk_pid140602
00:09:19.083 Removing: /var/run/dpdk/spdk_pid143735
00:09:19.083 Removing: /var/run/dpdk/spdk_pid144038
00:09:19.083 Removing: /var/run/dpdk/spdk_pid144329
00:09:19.083 Removing: /var/run/dpdk/spdk_pid144333
00:09:19.083 Removing: /var/run/dpdk/spdk_pid144893
00:09:19.083 Removing: /var/run/dpdk/spdk_pid144901
00:09:19.083 Removing: /var/run/dpdk/spdk_pid145461
00:09:19.083 Removing: /var/run/dpdk/spdk_pid145468
00:09:19.083 Removing: /var/run/dpdk/spdk_pid145772
00:09:19.083 Removing: /var/run/dpdk/spdk_pid145809
00:09:19.083 Removing: /var/run/dpdk/spdk_pid146068
00:09:19.083 Removing: /var/run/dpdk/spdk_pid146088
00:09:19.083 Removing: /var/run/dpdk/spdk_pid146706
00:09:19.083 Removing: /var/run/dpdk/spdk_pid146990
00:09:19.083 Removing: /var/run/dpdk/spdk_pid147167
00:09:19.083 Removing: /var/run/dpdk/spdk_pid147355
00:09:19.083 Removing: /var/run/dpdk/spdk_pid148112
00:09:19.083 Removing: /var/run/dpdk/spdk_pid148399
00:09:19.083 Removing: /var/run/dpdk/spdk_pid148928
00:09:19.083 Removing: /var/run/dpdk/spdk_pid149339
00:09:19.083 Removing: /var/run/dpdk/spdk_pid149749
00:09:19.083 Removing: /var/run/dpdk/spdk_pid150282
00:09:19.083 Removing: /var/run/dpdk/spdk_pid150579
00:09:19.083 Removing: /var/run/dpdk/spdk_pid151107
00:09:19.083 Removing: /var/run/dpdk/spdk_pid151562
00:09:19.083 Removing: /var/run/dpdk/spdk_pid151918
00:09:19.083 Removing: /var/run/dpdk/spdk_pid152453
00:09:19.083 Removing: /var/run/dpdk/spdk_pid152807
00:09:19.083 Removing: /var/run/dpdk/spdk_pid153277
00:09:19.083 Removing: /var/run/dpdk/spdk_pid153804
00:09:19.083 Removing: /var/run/dpdk/spdk_pid154097
00:09:19.083 Removing: /var/run/dpdk/spdk_pid154627
00:09:19.083 Removing: /var/run/dpdk/spdk_pid155053
00:09:19.083 Removing: /var/run/dpdk/spdk_pid155451
00:09:19.083 Removing: /var/run/dpdk/spdk_pid155980
00:09:19.083 Removing: /var/run/dpdk/spdk_pid156318
00:09:19.083 Removing: /var/run/dpdk/spdk_pid156800
00:09:19.083 Removing: /var/run/dpdk/spdk_pid157291
00:09:19.083 Removing: /var/run/dpdk/spdk_pid157696
00:09:19.083 Removing: /var/run/dpdk/spdk_pid158274
00:09:19.083 Removing: /var/run/dpdk/spdk_pid159087
00:09:19.083 Removing: /var/run/dpdk/spdk_pid159699
00:09:19.083 Removing: /var/run/dpdk/spdk_pid160152
00:09:19.083 Removing: /var/run/dpdk/spdk_pid160688
00:09:19.083 Removing: /var/run/dpdk/spdk_pid161233
00:09:19.083 Removing: /var/run/dpdk/spdk_pid161647
00:09:19.083 Removing: /var/run/dpdk/spdk_pid162060
00:09:19.083 Removing: /var/run/dpdk/spdk_pid162591
00:09:19.083 Clean
00:09:19.083 11:03:43 -- common/autotest_common.sh@1453 -- # return 0
00:09:19.083 11:03:43 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup
00:09:19.083 11:03:43 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:19.083 11:03:43 -- common/autotest_common.sh@10 -- # set +x
00:09:19.083 11:03:43 -- spdk/autotest.sh@391 -- # timing_exit autotest
00:09:19.083 11:03:43 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:19.083 11:03:43 -- common/autotest_common.sh@10 -- # set +x
00:09:19.083 11:03:43 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:19.083 11:03:43 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:19.083 11:03:43 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:19.083 11:03:43 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:09:19.083 11:03:43 -- spdk/autotest.sh@398 -- # hostname
00:09:19.083 11:03:43 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:19.083 geninfo: WARNING: invalid characters removed from testname!
00:09:24.374 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda
00:09:28.581 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda
00:09:32.786 11:03:56 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:40.920 11:04:04 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:45.116 11:04:09 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:50.394 11:04:14 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:55.675 11:04:19 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:00.954 11:04:25 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:06.233 11:04:30 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:10:06.233 11:04:30 -- spdk/autorun.sh@1 -- $ timing_finish
00:10:06.233 11:04:30 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]]
00:10:06.233 11:04:30 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:06.233 11:04:30 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:10:06.233 11:04:30 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:06.233 + [[ -n 5640 ]]
00:10:06.233 + sudo kill 5640
00:10:06.244 [Pipeline] }
00:10:06.258 [Pipeline] // stage
00:10:06.263 [Pipeline] }
00:10:06.279 [Pipeline] // timeout
00:10:06.285 [Pipeline] }
00:10:06.299 [Pipeline] // catchError
00:10:06.304 [Pipeline] }
00:10:06.319 [Pipeline] // wrap
00:10:06.325 [Pipeline] }
00:10:06.338 [Pipeline] // catchError
00:10:06.347 [Pipeline] stage
00:10:06.349 [Pipeline] { (Epilogue)
00:10:06.362 [Pipeline] catchError
00:10:06.364 [Pipeline] {
00:10:06.376 [Pipeline] echo
00:10:06.378 Cleanup processes
00:10:06.384 [Pipeline] sh
00:10:06.677 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:06.677 171054 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:06.692 [Pipeline] sh
00:10:06.980 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:06.980 ++ grep -v 'sudo pgrep'
00:10:06.980 ++ awk '{print $1}'
00:10:06.980 + sudo kill -9
00:10:06.980 + true
00:10:06.993 [Pipeline] sh
00:10:07.280 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:07.280 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:07.280 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:08.660 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:18.662 [Pipeline] sh
00:10:18.951 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:18.951 Artifacts sizes are good
00:10:18.967 [Pipeline] archiveArtifacts
00:10:18.975 Archiving artifacts
00:10:19.387 [Pipeline] sh
00:10:19.677 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:19.692 [Pipeline] cleanWs
00:10:19.702 [WS-CLEANUP] Deleting project workspace...
00:10:19.702 [WS-CLEANUP] Deferred wipeout is used...
00:10:19.709 [WS-CLEANUP] done
00:10:19.710 [Pipeline] }
00:10:19.727 [Pipeline] // catchError
00:10:19.737 [Pipeline] sh
00:10:20.026 + logger -p user.info -t JENKINS-CI
00:10:20.036 [Pipeline] }
00:10:20.050 [Pipeline] // stage
00:10:20.055 [Pipeline] }
00:10:20.069 [Pipeline] // node
00:10:20.073 [Pipeline] End of Pipeline
00:10:20.115 Finished: SUCCESS