00:00:00.002 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 979 00:00:00.002 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3646 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.002 Started by timer 00:00:00.053 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.054 The recommended git tool is: git 00:00:00.055 using credential 00000000-0000-0000-0000-000000000002 00:00:00.057 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.070 Fetching changes from the remote Git repository 00:00:00.084 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.104 Using shallow fetch with depth 1 00:00:00.104 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.104 > git --version # timeout=10 00:00:00.131 > git --version # 'git version 2.39.2' 00:00:00.131 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.163 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.163 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:05.513 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:05.523 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:05.533 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:05.533 > git config core.sparsecheckout # timeout=10 00:00:05.543 > git read-tree -mu HEAD # timeout=10 00:00:05.558 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:05.580 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:05.580 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:05.659 [Pipeline] Start of Pipeline 00:00:05.674 [Pipeline] library 00:00:05.676 Loading library shm_lib@master 00:00:05.677 Library shm_lib@master is cached. Copying from home. 00:00:05.695 [Pipeline] node 00:00:05.712 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:05.714 [Pipeline] { 00:00:05.728 [Pipeline] catchError 00:00:05.730 [Pipeline] { 00:00:05.747 [Pipeline] wrap 00:00:05.756 [Pipeline] { 00:00:05.763 [Pipeline] stage 00:00:05.765 [Pipeline] { (Prologue) 00:00:05.965 [Pipeline] sh 00:00:06.251 + logger -p user.info -t JENKINS-CI 00:00:06.270 [Pipeline] echo 00:00:06.271 Node: WFP20 00:00:06.278 [Pipeline] sh 00:00:06.578 [Pipeline] setCustomBuildProperty 00:00:06.587 [Pipeline] echo 00:00:06.589 Cleanup processes 00:00:06.592 [Pipeline] sh 00:00:06.875 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:06.875 502829 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:06.887 [Pipeline] sh 00:00:07.172 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.172 ++ grep -v 'sudo pgrep' 00:00:07.172 ++ awk '{print $1}' 00:00:07.172 + sudo kill -9 00:00:07.172 + true 00:00:07.187 [Pipeline] cleanWs 00:00:07.198 [WS-CLEANUP] Deleting project workspace... 00:00:07.198 [WS-CLEANUP] Deferred wipeout is used... 
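The cleanup step traced above is a standard pattern for this kind of job: pgrep -af prints "PID full-command" for every process whose command line mentions the workspace, grep -v drops the pgrep invocation itself (which also matches), and awk keeps only the PID column. In this run nothing was left over, so kill -9 received an empty argument list and the trailing "+ true" swallowed its failure. A minimal standalone sketch of the same pattern, assuming the same workspace path as this job:

#!/usr/bin/env bash
# Sketch of the stale-process cleanup from the Prologue stage above.
WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

# pgrep -af prints "PID full-command"; drop the pgrep line itself,
# then keep only the PID column.
pids=$(sudo pgrep -af "$WORKSPACE" | grep -v 'sudo pgrep' | awk '{print $1}')

# $pids is left unquoted on purpose so multiple PIDs split into
# separate arguments. When nothing is running, kill -9 gets no
# arguments and fails; "|| true" keeps the stage green in that
# (normal) case.
sudo kill -9 $pids || true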
00:00:07.204 [WS-CLEANUP] done 00:00:07.208 [Pipeline] setCustomBuildProperty 00:00:07.223 [Pipeline] sh 00:00:07.505 + sudo git config --global --replace-all safe.directory '*' 00:00:07.635 [Pipeline] httpRequest 00:00:07.995 [Pipeline] echo 00:00:07.997 Sorcerer 10.211.164.20 is alive 00:00:08.008 [Pipeline] retry 00:00:08.010 [Pipeline] { 00:00:08.025 [Pipeline] httpRequest 00:00:08.030 HttpMethod: GET 00:00:08.030 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.031 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.042 Response Code: HTTP/1.1 200 OK 00:00:08.042 Success: Status code 200 is in the accepted range: 200,404 00:00:08.042 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:10.386 [Pipeline] } 00:00:10.399 [Pipeline] // retry 00:00:10.404 [Pipeline] sh 00:00:10.686 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:10.704 [Pipeline] httpRequest 00:00:11.090 [Pipeline] echo 00:00:11.092 Sorcerer 10.211.164.20 is alive 00:00:11.103 [Pipeline] retry 00:00:11.105 [Pipeline] { 00:00:11.121 [Pipeline] httpRequest 00:00:11.125 HttpMethod: GET 00:00:11.126 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:11.126 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:11.142 Response Code: HTTP/1.1 200 OK 00:00:11.142 Success: Status code 200 is in the accepted range: 200,404 00:00:11.142 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:07.026 [Pipeline] } 00:01:07.041 [Pipeline] // retry 00:01:07.048 [Pipeline] sh 00:01:07.331 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:09.881 [Pipeline] sh 00:01:10.165 + git -C spdk log --oneline -n5 00:01:10.165 c13c99a5e test: Various fixes for Fedora40 00:01:10.165 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:01:10.165 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:01:10.165 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:01:10.166 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:01:10.182 [Pipeline] withCredentials 00:01:10.192 > git --version # timeout=10 00:01:10.202 > git --version # 'git version 2.39.2' 00:01:10.219 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:10.221 [Pipeline] { 00:01:10.229 [Pipeline] retry 00:01:10.230 [Pipeline] { 00:01:10.244 [Pipeline] sh 00:01:10.528 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:11.920 [Pipeline] } 00:01:11.938 [Pipeline] // retry 00:01:11.944 [Pipeline] } 00:01:11.962 [Pipeline] // withCredentials 00:01:11.972 [Pipeline] httpRequest 00:01:12.340 [Pipeline] echo 00:01:12.342 Sorcerer 10.211.164.20 is alive 00:01:12.353 [Pipeline] retry 00:01:12.355 [Pipeline] { 00:01:12.370 [Pipeline] httpRequest 00:01:12.374 HttpMethod: GET 00:01:12.375 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:12.376 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:12.383 Response Code: HTTP/1.1 200 OK 00:01:12.384 Success: Status code 200 is in the accepted range: 200,404 00:01:12.384 Saving response body to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:28.999 [Pipeline] } 00:01:29.015 [Pipeline] // retry 00:01:29.022 [Pipeline] sh 00:01:29.307 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:30.700 [Pipeline] sh 00:01:30.986 + git -C dpdk log --oneline -n5 00:01:30.986 caf0f5d395 version: 22.11.4 00:01:30.986 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:30.986 dc9c799c7d vhost: fix missing spinlock unlock 00:01:30.986 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:30.986 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:30.997 [Pipeline] } 00:01:31.018 [Pipeline] // stage 00:01:31.027 [Pipeline] stage 00:01:31.029 [Pipeline] { (Prepare) 00:01:31.049 [Pipeline] writeFile 00:01:31.064 [Pipeline] sh 00:01:31.349 + logger -p user.info -t JENKINS-CI 00:01:31.361 [Pipeline] sh 00:01:31.662 + logger -p user.info -t JENKINS-CI 00:01:31.672 [Pipeline] sh 00:01:31.953 + cat autorun-spdk.conf 00:01:31.953 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:31.953 SPDK_RUN_UBSAN=1 00:01:31.953 SPDK_TEST_FUZZER=1 00:01:31.953 SPDK_TEST_FUZZER_SHORT=1 00:01:31.953 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:31.953 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:31.961 RUN_NIGHTLY=1 00:01:31.966 [Pipeline] readFile 00:01:31.990 [Pipeline] withEnv 00:01:31.992 [Pipeline] { 00:01:32.005 [Pipeline] sh 00:01:32.292 + set -ex 00:01:32.292 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:32.292 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:32.292 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:32.292 ++ SPDK_RUN_UBSAN=1 00:01:32.292 ++ SPDK_TEST_FUZZER=1 00:01:32.292 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:32.292 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:32.292 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:32.292 ++ RUN_NIGHTLY=1 00:01:32.292 + case $SPDK_TEST_NVMF_NICS in 00:01:32.292 + DRIVERS= 00:01:32.292 + [[ -n '' ]] 00:01:32.292 + exit 0 00:01:32.301 [Pipeline] } 00:01:32.315 [Pipeline] // withEnv 00:01:32.320 [Pipeline] } 00:01:32.333 [Pipeline] // stage 00:01:32.342 [Pipeline] catchError 00:01:32.344 [Pipeline] { 00:01:32.357 [Pipeline] timeout 00:01:32.357 Timeout set to expire in 30 min 00:01:32.358 [Pipeline] { 00:01:32.372 [Pipeline] stage 00:01:32.374 [Pipeline] { (Tests) 00:01:32.387 [Pipeline] sh 00:01:32.673 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:32.673 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:32.673 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:32.673 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:32.673 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:32.673 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:32.673 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:32.673 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:32.673 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:32.673 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:32.673 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:32.673 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:32.673 + source /etc/os-release 00:01:32.673 ++ NAME='Fedora Linux' 00:01:32.673 ++ VERSION='39 (Cloud Edition)' 00:01:32.673 ++ ID=fedora 00:01:32.673 ++ VERSION_ID=39 00:01:32.673 ++ VERSION_CODENAME= 00:01:32.674 ++ PLATFORM_ID=platform:f39 00:01:32.674 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:01:32.674 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:32.674 ++ LOGO=fedora-logo-icon 00:01:32.674 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:01:32.674 ++ HOME_URL=https://fedoraproject.org/ 00:01:32.674 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:01:32.674 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:32.674 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:32.674 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:32.674 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:01:32.674 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:32.674 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:01:32.674 ++ SUPPORT_END=2024-11-12 00:01:32.674 ++ VARIANT='Cloud Edition' 00:01:32.674 ++ VARIANT_ID=cloud 00:01:32.674 + uname -a 00:01:32.674 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:01:32.674 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:35.969 Hugepages 00:01:35.969 node hugesize free / total 00:01:35.969 node0 1048576kB 0 / 0 00:01:35.969 node0 2048kB 0 / 0 00:01:35.969 node1 1048576kB 0 / 0 00:01:35.969 node1 2048kB 0 / 0 00:01:35.969 00:01:35.969 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:35.969 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:35.969 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:35.969 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:35.969 + rm -f /tmp/spdk-ld-path 00:01:35.969 + source autorun-spdk.conf 00:01:35.969 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:35.969 ++ SPDK_RUN_UBSAN=1 00:01:35.969 ++ SPDK_TEST_FUZZER=1 00:01:35.969 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:35.969 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:35.969 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:35.969 ++ RUN_NIGHTLY=1 00:01:35.969 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:35.969 + [[ -n '' ]] 00:01:35.969 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:35.970 + for M in /var/spdk/build-*-manifest.txt 00:01:35.970 + [[ -f 
/var/spdk/build-kernel-manifest.txt ]] 00:01:35.970 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:35.970 + for M in /var/spdk/build-*-manifest.txt 00:01:35.970 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:35.970 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:35.970 + for M in /var/spdk/build-*-manifest.txt 00:01:35.970 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:35.970 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:35.970 ++ uname 00:01:35.970 + [[ Linux == \L\i\n\u\x ]] 00:01:35.970 + sudo dmesg -T 00:01:35.970 + sudo dmesg --clear 00:01:35.970 + dmesg_pid=504304 00:01:35.970 + [[ Fedora Linux == FreeBSD ]] 00:01:35.970 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:35.970 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:35.970 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:35.970 + [[ -x /usr/src/fio-static/fio ]] 00:01:35.970 + export FIO_BIN=/usr/src/fio-static/fio 00:01:35.970 + FIO_BIN=/usr/src/fio-static/fio 00:01:35.970 + sudo dmesg -Tw 00:01:35.970 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:35.970 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:35.970 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:35.970 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:35.970 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:35.970 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:35.970 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:35.970 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:35.970 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:35.970 Test configuration: 00:01:35.970 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:35.970 SPDK_RUN_UBSAN=1 00:01:35.970 SPDK_TEST_FUZZER=1 00:01:35.970 SPDK_TEST_FUZZER_SHORT=1 00:01:35.970 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:35.970 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:35.970 RUN_NIGHTLY=1 17:45:28 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:35.970 17:45:28 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:35.970 17:45:28 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:35.970 17:45:28 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:35.970 17:45:28 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:35.970 17:45:28 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:35.970 17:45:28 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:35.970 17:45:28 -- paths/export.sh@5 -- $ export PATH 00:01:35.970 17:45:28 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:35.970 17:45:28 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:35.970 17:45:28 -- common/autobuild_common.sh@440 -- $ date +%s 00:01:35.970 17:45:28 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732034728.XXXXXX 00:01:35.970 17:45:28 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732034728.9hiwiY 00:01:35.970 17:45:28 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:35.970 17:45:28 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@456 -- $ get_config_params 00:01:35.970 17:45:28 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:01:35.970 17:45:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:35.970 17:45:28 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:35.970 17:45:28 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:35.970 17:45:28 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:35.970 17:45:28 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:35.970 17:45:28 -- spdk/autobuild.sh@16 -- $ date -u 00:01:35.970 Tue Nov 19 04:45:28 PM UTC 2024 00:01:35.970 17:45:28 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:35.970 LTS-67-gc13c99a5e 00:01:35.970 17:45:28 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:35.970 17:45:28 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:35.970 17:45:28 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:35.970 17:45:28 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:35.970 17:45:28 -- common/autotest_common.sh@1093 -- $ 
xtrace_disable 00:01:35.970 17:45:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:35.970 ************************************ 00:01:35.970 START TEST ubsan 00:01:35.970 ************************************ 00:01:35.970 17:45:28 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:01:35.970 using ubsan 00:01:35.970 00:01:35.970 real 0m0.001s 00:01:35.970 user 0m0.000s 00:01:35.970 sys 0m0.000s 00:01:35.970 17:45:28 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:35.970 17:45:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:35.970 ************************************ 00:01:35.970 END TEST ubsan 00:01:35.970 ************************************ 00:01:35.970 17:45:28 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:35.970 17:45:28 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:35.970 17:45:28 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:35.970 17:45:28 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:35.970 17:45:28 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:35.970 17:45:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:35.970 ************************************ 00:01:35.970 START TEST build_native_dpdk 00:01:35.970 ************************************ 00:01:35.970 17:45:28 -- common/autotest_common.sh@1114 -- $ _build_native_dpdk 00:01:35.970 17:45:28 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:35.970 17:45:28 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:35.970 17:45:28 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:35.970 17:45:28 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:35.970 17:45:28 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:35.970 17:45:28 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:35.970 17:45:28 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:35.970 17:45:28 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:35.970 17:45:28 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:35.970 17:45:28 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:35.970 17:45:28 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:35.970 17:45:28 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:35.970 17:45:28 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:35.970 17:45:28 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:35.970 17:45:28 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:35.970 17:45:28 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:35.970 17:45:28 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:35.970 caf0f5d395 version: 22.11.4 00:01:35.970 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:35.970 dc9c799c7d vhost: fix missing spinlock unlock 00:01:35.970 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:35.970 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:35.970 17:45:28 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:35.970 17:45:28 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:35.970 17:45:28 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:35.970 17:45:28 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:35.970 17:45:28 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:35.970 17:45:28 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:35.970 17:45:28 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:35.971 17:45:28 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:35.971 17:45:28 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:35.971 17:45:28 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:35.971 17:45:28 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:35.971 17:45:28 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:35.971 17:45:28 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:35.971 17:45:28 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:35.971 17:45:28 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:35.971 17:45:28 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:35.971 17:45:28 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:35.971 17:45:28 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:35.971 17:45:28 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:35.971 17:45:28 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:35.971 17:45:28 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:35.971 17:45:28 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:35.971 17:45:28 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:35.971 17:45:28 -- scripts/common.sh@343 -- $ case "$op" in 00:01:35.971 17:45:28 -- scripts/common.sh@344 -- $ : 1 00:01:35.971 17:45:28 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:35.971 17:45:28 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:36.230 17:45:28 -- scripts/common.sh@364 -- $ decimal 22 00:01:36.230 17:45:28 -- scripts/common.sh@352 -- $ local d=22 00:01:36.230 17:45:28 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:36.230 17:45:28 -- scripts/common.sh@354 -- $ echo 22 00:01:36.230 17:45:28 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:36.230 17:45:28 -- scripts/common.sh@365 -- $ decimal 21 00:01:36.230 17:45:28 -- scripts/common.sh@352 -- $ local d=21 00:01:36.230 17:45:28 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:36.230 17:45:28 -- scripts/common.sh@354 -- $ echo 21 00:01:36.230 17:45:28 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:36.230 17:45:28 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:36.230 17:45:28 -- scripts/common.sh@366 -- $ return 1 00:01:36.230 17:45:28 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:36.230 patching file config/rte_config.h 00:01:36.230 Hunk #1 succeeded at 60 (offset 1 line). 00:01:36.230 17:45:28 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:36.230 17:45:28 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:36.230 17:45:28 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:36.230 17:45:28 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:36.230 17:45:28 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:36.230 17:45:28 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:36.230 17:45:28 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:36.230 17:45:28 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:36.230 17:45:28 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:36.230 17:45:28 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:36.230 17:45:28 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:36.230 17:45:28 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:36.230 17:45:28 -- scripts/common.sh@343 -- $ case "$op" in 00:01:36.230 17:45:28 -- scripts/common.sh@344 -- $ : 1 00:01:36.230 17:45:28 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:36.230 17:45:28 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:36.230 17:45:28 -- scripts/common.sh@364 -- $ decimal 22 00:01:36.230 17:45:28 -- scripts/common.sh@352 -- $ local d=22 00:01:36.230 17:45:28 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:36.230 17:45:28 -- scripts/common.sh@354 -- $ echo 22 00:01:36.230 17:45:28 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:36.230 17:45:28 -- scripts/common.sh@365 -- $ decimal 24 00:01:36.230 17:45:28 -- scripts/common.sh@352 -- $ local d=24 00:01:36.230 17:45:28 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:36.230 17:45:28 -- scripts/common.sh@354 -- $ echo 24 00:01:36.231 17:45:28 -- scripts/common.sh@365 -- $ ver2[v]=24 00:01:36.231 17:45:28 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:36.231 17:45:28 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:01:36.231 17:45:28 -- scripts/common.sh@367 -- $ return 0 00:01:36.231 17:45:28 -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:36.231 patching file lib/pcapng/rte_pcapng.c 00:01:36.231 Hunk #1 succeeded at 110 (offset -18 lines). 
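The xtrace above is scripts/common.sh gating the two compatibility patches on version checks: cmp_versions splits both version strings into arrays on ".", "-" and ":" and walks them field by field as decimal numbers, returning at the first field that differs. Here lt 22.11.4 21.11.0 returns 1 and lt 22.11.4 24.07.0 returns 0, and each result gates one of the two patch -p1 calls above. A simplified sketch of that comparison, leaving out the decimal() normalization the real helper applies to non-numeric fields:

# version_lt A B: succeed (return 0) iff version A < version B.
version_lt() {
    local -a ver1 ver2
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$2"
    local len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < len; v++ )); do
        # Missing fields compare as 0, so 22.11 == 22.11.0.
        local d1=${ver1[v]:-0} d2=${ver2[v]:-0}
        (( d1 > d2 )) && return 1    # first differing field decides
        (( d1 < d2 )) && return 0
    done
    return 1    # equal versions are not "less than"
}

version_lt 22.11.4 21.11.0; echo $?   # 1, matching the trace above
version_lt 22.11.4 24.07.0; echo $?   # 0, so the pcapng patch applies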
00:01:36.231 17:45:28 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false 00:01:36.231 17:45:28 -- common/autobuild_common.sh@181 -- $ uname -s 00:01:36.231 17:45:28 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']' 00:01:36.231 17:45:28 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:36.231 17:45:28 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:41.512 The Meson build system 00:01:41.512 Version: 1.5.0 00:01:41.512 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:41.512 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:41.512 Build type: native build 00:01:41.512 Program cat found: YES (/usr/bin/cat) 00:01:41.512 Project name: DPDK 00:01:41.512 Project version: 22.11.4 00:01:41.512 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:41.512 C linker for the host machine: gcc ld.bfd 2.40-14 00:01:41.512 Host machine cpu family: x86_64 00:01:41.513 Host machine cpu: x86_64 00:01:41.513 Message: ## Building in Developer Mode ## 00:01:41.513 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:41.513 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:41.513 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:41.513 Program objdump found: YES (/usr/bin/objdump) 00:01:41.513 Program python3 found: YES (/usr/bin/python3) 00:01:41.513 Program cat found: YES (/usr/bin/cat) 00:01:41.513 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:01:41.513 Checking for size of "void *" : 8 00:01:41.513 Checking for size of "void *" : 8 (cached) 00:01:41.513 Library m found: YES 00:01:41.513 Library numa found: YES 00:01:41.513 Has header "numaif.h" : YES 00:01:41.513 Library fdt found: NO 00:01:41.513 Library execinfo found: NO 00:01:41.513 Has header "execinfo.h" : YES 00:01:41.513 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:41.513 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:41.513 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:41.513 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:41.513 Run-time dependency openssl found: YES 3.1.1 00:01:41.513 Run-time dependency libpcap found: YES 1.10.4 00:01:41.513 Has header "pcap.h" with dependency libpcap: YES 00:01:41.513 Compiler for C supports arguments -Wcast-qual: YES 00:01:41.513 Compiler for C supports arguments -Wdeprecated: YES 00:01:41.513 Compiler for C supports arguments -Wformat: YES 00:01:41.513 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:41.513 Compiler for C supports arguments -Wformat-security: NO 00:01:41.513 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:41.513 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:41.513 Compiler for C supports arguments -Wnested-externs: YES 00:01:41.513 Compiler for C supports arguments -Wold-style-definition: YES 00:01:41.513 Compiler for C supports arguments -Wpointer-arith: YES 00:01:41.513 Compiler for C supports arguments -Wsign-compare: YES 00:01:41.513 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:41.513 Compiler for C supports arguments -Wundef: YES 00:01:41.513 Compiler for C supports arguments -Wwrite-strings: YES 00:01:41.513 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:41.513 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:41.513 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:41.513 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:41.513 Compiler for C supports arguments -mavx512f: YES 00:01:41.513 Checking if "AVX512 checking" compiles: YES 00:01:41.513 Fetching value of define "__SSE4_2__" : 1 00:01:41.513 Fetching value of define "__AES__" : 1 00:01:41.513 Fetching value of define "__AVX__" : 1 00:01:41.513 Fetching value of define "__AVX2__" : 1 00:01:41.513 Fetching value of define "__AVX512BW__" : 1 00:01:41.513 Fetching value of define "__AVX512CD__" : 1 00:01:41.513 Fetching value of define "__AVX512DQ__" : 1 00:01:41.513 Fetching value of define "__AVX512F__" : 1 00:01:41.513 Fetching value of define "__AVX512VL__" : 1 00:01:41.513 Fetching value of define "__PCLMUL__" : 1 00:01:41.513 Fetching value of define "__RDRND__" : 1 00:01:41.513 Fetching value of define "__RDSEED__" : 1 00:01:41.513 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:41.513 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:41.513 Message: lib/kvargs: Defining dependency "kvargs" 00:01:41.513 Message: lib/telemetry: Defining dependency "telemetry" 00:01:41.513 Checking for function "getentropy" : YES 00:01:41.513 Message: lib/eal: Defining dependency "eal" 00:01:41.513 Message: lib/ring: Defining dependency "ring" 00:01:41.513 Message: lib/rcu: Defining dependency "rcu" 00:01:41.513 Message: lib/mempool: Defining dependency "mempool" 00:01:41.513 Message: lib/mbuf: Defining dependency "mbuf" 00:01:41.513 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:41.513 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:41.513 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:41.513 Compiler for C supports arguments -mpclmul: YES 00:01:41.513 Compiler for C supports arguments -maes: YES 00:01:41.513 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:41.513 Compiler for C supports arguments -mavx512bw: YES 00:01:41.513 Compiler for C supports arguments -mavx512dq: YES 00:01:41.513 Compiler for C supports arguments -mavx512vl: YES 00:01:41.513 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:41.513 Compiler for C supports arguments -mavx2: YES 00:01:41.513 Compiler for C supports arguments -mavx: YES 00:01:41.513 Message: lib/net: Defining dependency "net" 00:01:41.513 Message: lib/meter: Defining dependency "meter" 00:01:41.513 Message: lib/ethdev: Defining dependency "ethdev" 00:01:41.513 Message: lib/pci: Defining dependency "pci" 00:01:41.513 Message: lib/cmdline: Defining dependency "cmdline" 00:01:41.513 Message: lib/metrics: Defining dependency "metrics" 00:01:41.513 Message: lib/hash: Defining dependency "hash" 00:01:41.513 Message: lib/timer: Defining dependency "timer" 00:01:41.513 Fetching value of define "__AVX2__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:41.513 Message: lib/acl: Defining dependency "acl" 00:01:41.513 Message: lib/bbdev: Defining dependency "bbdev" 00:01:41.513 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:41.513 Run-time dependency libelf found: YES 0.191 00:01:41.513 Message: lib/bpf: Defining dependency "bpf" 00:01:41.513 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:41.513 Message: lib/compressdev: Defining dependency "compressdev" 00:01:41.513 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:41.513 Message: lib/distributor: Defining dependency "distributor" 00:01:41.513 Message: lib/efd: Defining dependency "efd" 00:01:41.513 Message: lib/eventdev: Defining dependency "eventdev" 00:01:41.513 Message: lib/gpudev: Defining dependency "gpudev" 00:01:41.513 Message: lib/gro: Defining dependency "gro" 00:01:41.513 Message: lib/gso: Defining dependency "gso" 00:01:41.513 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:41.513 Message: lib/jobstats: Defining dependency "jobstats" 00:01:41.513 Message: lib/latencystats: Defining dependency "latencystats" 00:01:41.513 Message: lib/lpm: Defining dependency "lpm" 00:01:41.513 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:41.513 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:41.513 Message: lib/member: Defining dependency "member" 00:01:41.513 Message: lib/pcapng: Defining dependency "pcapng" 00:01:41.513 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:41.513 Message: lib/power: Defining dependency "power" 00:01:41.513 Message: lib/rawdev: Defining dependency "rawdev" 00:01:41.513 Message: lib/regexdev: Defining dependency "regexdev" 00:01:41.513 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:41.513 Message: lib/rib: Defining dependency "rib" 00:01:41.513 Message: lib/reorder: Defining dependency "reorder" 00:01:41.513 Message: lib/sched: Defining dependency "sched" 00:01:41.513 Message: lib/security: Defining dependency "security" 00:01:41.513 Message: lib/stack: Defining dependency "stack" 00:01:41.513 Has header "linux/userfaultfd.h" : YES 00:01:41.513 Message: lib/vhost: Defining dependency "vhost" 00:01:41.513 Message: lib/ipsec: Defining dependency "ipsec" 00:01:41.513 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:41.513 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:41.513 Message: lib/fib: Defining dependency "fib" 00:01:41.513 Message: lib/port: Defining dependency "port" 00:01:41.513 Message: lib/pdump: Defining dependency "pdump" 00:01:41.513 Message: lib/table: Defining dependency "table" 00:01:41.513 Message: lib/pipeline: Defining dependency "pipeline" 00:01:41.513 Message: lib/graph: Defining dependency "graph" 00:01:41.513 Message: lib/node: Defining dependency "node" 00:01:41.513 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:41.513 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:41.513 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:41.513 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:41.513 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:41.513 Compiler for C supports arguments -Wno-unused-value: YES 00:01:41.513 Compiler for C supports arguments -Wno-format: YES 00:01:41.513 Compiler for C supports arguments -Wno-format-security: YES 00:01:41.513 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:41.773 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:41.773 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:41.773 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:41.773 Fetching value of define "__AVX2__" : 1 (cached) 00:01:41.773 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:41.773 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:41.773 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:41.773 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:41.773 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:41.773 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:41.773 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:41.773 Configuring doxy-api.conf using configuration 00:01:41.773 Program sphinx-build found: NO 00:01:41.773 Configuring rte_build_config.h using configuration 00:01:41.773 Message: 00:01:41.773 ================= 00:01:41.773 Applications Enabled 00:01:41.773 ================= 00:01:41.773 00:01:41.773 apps: 00:01:41.773 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:41.773 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:41.773 test-security-perf, 00:01:41.773 00:01:41.773 Message: 00:01:41.773 ================= 00:01:41.773 Libraries Enabled 00:01:41.773 ================= 00:01:41.773 00:01:41.773 libs: 00:01:41.773 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:41.773 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:41.773 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:41.773 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:41.773 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:41.773 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:41.773 table, pipeline, graph, node, 00:01:41.773 00:01:41.773 Message: 00:01:41.773 =============== 00:01:41.773 Drivers Enabled 00:01:41.773 =============== 00:01:41.773 00:01:41.773 common: 00:01:41.773 00:01:41.773 bus: 00:01:41.773 pci, vdev, 00:01:41.773 mempool: 00:01:41.773 ring, 00:01:41.773 dma: 00:01:41.773 00:01:41.773 net: 00:01:41.773 i40e, 00:01:41.773 raw: 00:01:41.773 00:01:41.773 crypto: 00:01:41.773 00:01:41.773 compress: 00:01:41.773 00:01:41.773 regex: 00:01:41.773 00:01:41.773 vdpa: 00:01:41.773 00:01:41.773 event: 00:01:41.773 00:01:41.773 baseband: 00:01:41.773 00:01:41.773 gpu: 00:01:41.773 00:01:41.773 00:01:41.773 Message: 00:01:41.773 ================= 00:01:41.773 Content Skipped 00:01:41.773 ================= 00:01:41.773 00:01:41.773 apps: 00:01:41.773 00:01:41.773 libs: 00:01:41.773 kni: explicitly disabled via build config (deprecated lib) 00:01:41.773 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:41.773 00:01:41.773 drivers: 00:01:41.773 common/cpt: not in enabled drivers build config 00:01:41.773 common/dpaax: not in enabled drivers build config 00:01:41.773 common/iavf: not in enabled drivers build config 00:01:41.773 common/idpf: not in enabled drivers build config 00:01:41.773 common/mvep: not in enabled drivers build config 00:01:41.773 common/octeontx: not in enabled drivers build config 00:01:41.773 bus/auxiliary: not in enabled drivers build config 00:01:41.773 bus/dpaa: not in enabled drivers build config 00:01:41.773 bus/fslmc: not in enabled drivers build config 00:01:41.773 bus/ifpga: not in enabled drivers build config 00:01:41.773 bus/vmbus: not in enabled drivers build config 00:01:41.773 common/cnxk: not in enabled drivers build config 00:01:41.773 common/mlx5: not in enabled drivers build config 00:01:41.773 common/qat: not in enabled drivers build config 00:01:41.773 common/sfc_efx: not in enabled drivers build config 00:01:41.774 mempool/bucket: not in enabled drivers build config 00:01:41.774 mempool/cnxk: not in enabled drivers build config 00:01:41.774 mempool/dpaa: not in enabled drivers build config 00:01:41.774 mempool/dpaa2: not in enabled drivers build config 00:01:41.774 mempool/octeontx: not in enabled drivers build config 00:01:41.774 mempool/stack: not in enabled drivers build config 00:01:41.774 dma/cnxk: not in enabled drivers build config 00:01:41.774 dma/dpaa: not in enabled drivers build config 00:01:41.774 dma/dpaa2: not in enabled drivers build config 00:01:41.774 dma/hisilicon: not in enabled drivers build config 00:01:41.774 dma/idxd: not in enabled drivers build config 00:01:41.774 dma/ioat: not in enabled drivers build config 00:01:41.774 dma/skeleton: not in enabled drivers build config 00:01:41.774 net/af_packet: not in enabled drivers build config 00:01:41.774 net/af_xdp: not in enabled drivers build config 00:01:41.774 net/ark: not in enabled drivers build config 00:01:41.774 net/atlantic: not in enabled drivers build config 00:01:41.774 net/avp: not in enabled drivers build config 00:01:41.774 net/axgbe: not in enabled drivers build config 00:01:41.774 net/bnx2x: not in enabled drivers build config 00:01:41.774 net/bnxt: not in enabled drivers build config 00:01:41.774 net/bonding: not in enabled drivers build config 00:01:41.774 net/cnxk: not in enabled drivers build config 
00:01:41.774 net/cxgbe: not in enabled drivers build config 00:01:41.774 net/dpaa: not in enabled drivers build config 00:01:41.774 net/dpaa2: not in enabled drivers build config 00:01:41.774 net/e1000: not in enabled drivers build config 00:01:41.774 net/ena: not in enabled drivers build config 00:01:41.774 net/enetc: not in enabled drivers build config 00:01:41.774 net/enetfec: not in enabled drivers build config 00:01:41.774 net/enic: not in enabled drivers build config 00:01:41.774 net/failsafe: not in enabled drivers build config 00:01:41.774 net/fm10k: not in enabled drivers build config 00:01:41.774 net/gve: not in enabled drivers build config 00:01:41.774 net/hinic: not in enabled drivers build config 00:01:41.774 net/hns3: not in enabled drivers build config 00:01:41.774 net/iavf: not in enabled drivers build config 00:01:41.774 net/ice: not in enabled drivers build config 00:01:41.774 net/idpf: not in enabled drivers build config 00:01:41.774 net/igc: not in enabled drivers build config 00:01:41.774 net/ionic: not in enabled drivers build config 00:01:41.774 net/ipn3ke: not in enabled drivers build config 00:01:41.774 net/ixgbe: not in enabled drivers build config 00:01:41.774 net/kni: not in enabled drivers build config 00:01:41.774 net/liquidio: not in enabled drivers build config 00:01:41.774 net/mana: not in enabled drivers build config 00:01:41.774 net/memif: not in enabled drivers build config 00:01:41.774 net/mlx4: not in enabled drivers build config 00:01:41.774 net/mlx5: not in enabled drivers build config 00:01:41.774 net/mvneta: not in enabled drivers build config 00:01:41.774 net/mvpp2: not in enabled drivers build config 00:01:41.774 net/netvsc: not in enabled drivers build config 00:01:41.774 net/nfb: not in enabled drivers build config 00:01:41.774 net/nfp: not in enabled drivers build config 00:01:41.774 net/ngbe: not in enabled drivers build config 00:01:41.774 net/null: not in enabled drivers build config 00:01:41.774 net/octeontx: not in enabled drivers build config 00:01:41.774 net/octeon_ep: not in enabled drivers build config 00:01:41.774 net/pcap: not in enabled drivers build config 00:01:41.774 net/pfe: not in enabled drivers build config 00:01:41.774 net/qede: not in enabled drivers build config 00:01:41.774 net/ring: not in enabled drivers build config 00:01:41.774 net/sfc: not in enabled drivers build config 00:01:41.774 net/softnic: not in enabled drivers build config 00:01:41.774 net/tap: not in enabled drivers build config 00:01:41.774 net/thunderx: not in enabled drivers build config 00:01:41.774 net/txgbe: not in enabled drivers build config 00:01:41.774 net/vdev_netvsc: not in enabled drivers build config 00:01:41.774 net/vhost: not in enabled drivers build config 00:01:41.774 net/virtio: not in enabled drivers build config 00:01:41.774 net/vmxnet3: not in enabled drivers build config 00:01:41.774 raw/cnxk_bphy: not in enabled drivers build config 00:01:41.774 raw/cnxk_gpio: not in enabled drivers build config 00:01:41.774 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:41.774 raw/ifpga: not in enabled drivers build config 00:01:41.774 raw/ntb: not in enabled drivers build config 00:01:41.774 raw/skeleton: not in enabled drivers build config 00:01:41.774 crypto/armv8: not in enabled drivers build config 00:01:41.774 crypto/bcmfs: not in enabled drivers build config 00:01:41.774 crypto/caam_jr: not in enabled drivers build config 00:01:41.774 crypto/ccp: not in enabled drivers build config 00:01:41.774 crypto/cnxk: not in enabled drivers 
build config 00:01:41.774 crypto/dpaa_sec: not in enabled drivers build config 00:01:41.774 crypto/dpaa2_sec: not in enabled drivers build config 00:01:41.774 crypto/ipsec_mb: not in enabled drivers build config 00:01:41.774 crypto/mlx5: not in enabled drivers build config 00:01:41.774 crypto/mvsam: not in enabled drivers build config 00:01:41.774 crypto/nitrox: not in enabled drivers build config 00:01:41.774 crypto/null: not in enabled drivers build config 00:01:41.774 crypto/octeontx: not in enabled drivers build config 00:01:41.774 crypto/openssl: not in enabled drivers build config 00:01:41.774 crypto/scheduler: not in enabled drivers build config 00:01:41.774 crypto/uadk: not in enabled drivers build config 00:01:41.774 crypto/virtio: not in enabled drivers build config 00:01:41.774 compress/isal: not in enabled drivers build config 00:01:41.774 compress/mlx5: not in enabled drivers build config 00:01:41.774 compress/octeontx: not in enabled drivers build config 00:01:41.774 compress/zlib: not in enabled drivers build config 00:01:41.774 regex/mlx5: not in enabled drivers build config 00:01:41.774 regex/cn9k: not in enabled drivers build config 00:01:41.774 vdpa/ifc: not in enabled drivers build config 00:01:41.774 vdpa/mlx5: not in enabled drivers build config 00:01:41.774 vdpa/sfc: not in enabled drivers build config 00:01:41.774 event/cnxk: not in enabled drivers build config 00:01:41.774 event/dlb2: not in enabled drivers build config 00:01:41.774 event/dpaa: not in enabled drivers build config 00:01:41.774 event/dpaa2: not in enabled drivers build config 00:01:41.774 event/dsw: not in enabled drivers build config 00:01:41.774 event/opdl: not in enabled drivers build config 00:01:41.774 event/skeleton: not in enabled drivers build config 00:01:41.774 event/sw: not in enabled drivers build config 00:01:41.774 event/octeontx: not in enabled drivers build config 00:01:41.774 baseband/acc: not in enabled drivers build config 00:01:41.774 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:41.774 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:41.774 baseband/la12xx: not in enabled drivers build config 00:01:41.774 baseband/null: not in enabled drivers build config 00:01:41.774 baseband/turbo_sw: not in enabled drivers build config 00:01:41.774 gpu/cuda: not in enabled drivers build config 00:01:41.774 00:01:41.774 00:01:41.774 Build targets in project: 311 00:01:41.774 00:01:41.774 DPDK 22.11.4 00:01:41.774 00:01:41.774 User defined options 00:01:41.774 libdir : lib 00:01:41.774 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.774 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:41.774 c_link_args : 00:01:41.774 enable_docs : false 00:01:41.774 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:41.774 enable_kmods : false 00:01:41.774 machine : native 00:01:41.774 tests : false 00:01:41.774 00:01:41.774 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:41.774 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
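Configuration ends here and compilation starts below; this is the standard two-phase Meson/Ninja flow, where meson resolves all options into build-tmp/ (echoed back in the "User defined options" block) and ninja does the actual compile against the chosen --prefix. Reproducing this configuration by hand would look roughly like the following sketch, using the modern "meson setup" spelling the WARNING above asks for and dropping the deprecated -Dmachine=native:

cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk

# Phase 1: configure into build-tmp, installing under ./build, with the
# same driver list the script assembles in DPDK_DRIVERS above.
meson setup build-tmp --prefix="$PWD/build" --libdir lib \
    -Denable_docs=false -Denable_kmods=false -Dtests=false \
    -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base

# Phase 2: compile (the log below uses -j112) and install into --prefix.
ninja -C build-tmp -j"$(nproc)"
ninja -C build-tmp install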
00:01:42.044 17:45:34 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:42.044 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:42.044 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:42.044 [2/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:42.044 [3/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:42.044 [4/740] Generating lib/rte_telemetry_def with a custom command 00:01:42.044 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:42.044 [6/740] Generating lib/rte_ring_mingw with a custom command 00:01:42.044 [7/740] Generating lib/rte_ring_def with a custom command 00:01:42.044 [8/740] Generating lib/rte_eal_mingw with a custom command 00:01:42.044 [9/740] Generating lib/rte_rcu_def with a custom command 00:01:42.044 [10/740] Generating lib/rte_mempool_mingw with a custom command 00:01:42.044 [11/740] Generating lib/rte_mbuf_def with a custom command 00:01:42.308 [12/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:42.308 [13/740] Generating lib/rte_eal_def with a custom command 00:01:42.308 [14/740] Generating lib/rte_mempool_def with a custom command 00:01:42.308 [15/740] Generating lib/rte_meter_def with a custom command 00:01:42.308 [16/740] Generating lib/rte_meter_mingw with a custom command 00:01:42.308 [17/740] Generating lib/rte_rcu_mingw with a custom command 00:01:42.308 [18/740] Generating lib/rte_net_def with a custom command 00:01:42.308 [19/740] Generating lib/rte_net_mingw with a custom command 00:01:42.308 [20/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:42.308 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:42.308 [22/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:42.308 [23/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:42.308 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:42.308 [25/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:42.308 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:42.308 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:42.308 [28/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:42.308 [29/740] Generating lib/rte_ethdev_def with a custom command 00:01:42.308 [30/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:42.308 [31/740] Generating lib/rte_pci_def with a custom command 00:01:42.308 [32/740] Generating lib/rte_pci_mingw with a custom command 00:01:42.308 [33/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:42.308 [34/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:42.308 [35/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:42.308 [36/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:42.308 [37/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:42.308 [38/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:42.308 [39/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:42.308 [40/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:42.308 [41/740] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:42.308 [42/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:42.308 [43/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:42.308 [44/740] Generating lib/rte_cmdline_def with a custom command 00:01:42.308 [45/740] Generating lib/rte_metrics_mingw with a custom command 00:01:42.308 [46/740] Generating lib/rte_metrics_def with a custom command 00:01:42.308 [47/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:42.308 [48/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:42.308 [49/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:42.308 [50/740] Linking static target lib/librte_kvargs.a 00:01:42.308 [51/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:42.308 [52/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:42.308 [53/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:42.308 [54/740] Generating lib/rte_hash_def with a custom command 00:01:42.308 [55/740] Generating lib/rte_hash_mingw with a custom command 00:01:42.308 [56/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:42.308 [57/740] Generating lib/rte_timer_def with a custom command 00:01:42.308 [58/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:42.308 [59/740] Generating lib/rte_timer_mingw with a custom command 00:01:42.308 [60/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:42.308 [61/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:42.308 [62/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:42.308 [63/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:42.308 [64/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:42.308 [65/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:42.308 [66/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:42.308 [67/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:42.308 [68/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:42.308 [69/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:42.308 [70/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:42.308 [71/740] Generating lib/rte_acl_def with a custom command 00:01:42.308 [72/740] Generating lib/rte_acl_mingw with a custom command 00:01:42.308 [73/740] Generating lib/rte_bbdev_def with a custom command 00:01:42.308 [74/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:42.308 [75/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:42.308 [76/740] Generating lib/rte_bitratestats_def with a custom command 00:01:42.308 [77/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:42.308 [78/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:42.308 [79/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:42.308 [80/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:42.308 [81/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:42.308 [82/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:42.308 [83/740] Linking static target lib/librte_meter.a 00:01:42.572 [84/740] 
Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:42.572 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:42.572 [86/740] Generating lib/rte_cfgfile_def with a custom command 00:01:42.572 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:42.572 [88/740] Generating lib/rte_bpf_def with a custom command 00:01:42.572 [89/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:42.572 [90/740] Generating lib/rte_bpf_mingw with a custom command 00:01:42.572 [91/740] Linking static target lib/librte_pci.a 00:01:42.572 [92/740] Generating lib/rte_compressdev_def with a custom command 00:01:42.572 [93/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:42.572 [94/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:42.572 [95/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:42.572 [96/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:42.572 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:42.572 [98/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:42.572 [99/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:42.572 [100/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:42.572 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:42.572 [102/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:42.572 [103/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:42.572 [104/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:42.572 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:42.572 [106/740] Generating lib/rte_cryptodev_def with a custom command 00:01:42.572 [107/740] Linking static target lib/librte_ring.a 00:01:42.572 [108/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:42.572 [109/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:42.572 [110/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:42.572 [111/740] Generating lib/rte_distributor_def with a custom command 00:01:42.572 [112/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:42.572 [113/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:42.572 [114/740] Generating lib/rte_distributor_mingw with a custom command 00:01:42.572 [115/740] Generating lib/rte_efd_def with a custom command 00:01:42.572 [116/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:42.572 [117/740] Generating lib/rte_efd_mingw with a custom command 00:01:42.572 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:42.572 [119/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:42.572 [120/740] Generating lib/rte_eventdev_def with a custom command 00:01:42.572 [121/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:42.572 [122/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:42.572 [123/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:42.572 [124/740] Generating lib/rte_gpudev_def with a custom command 00:01:42.572 [125/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:42.572 [126/740] 
Generating lib/rte_gro_def with a custom command 00:01:42.572 [127/740] Generating lib/rte_gro_mingw with a custom command 00:01:42.572 [128/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:42.572 [129/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:42.572 [130/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:42.572 [131/740] Generating lib/rte_gso_def with a custom command 00:01:42.572 [132/740] Generating lib/rte_gso_mingw with a custom command 00:01:42.572 [133/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:42.831 [134/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:42.831 [135/740] Generating lib/rte_ip_frag_def with a custom command 00:01:42.831 [136/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:42.831 [137/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:42.831 [138/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.831 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.831 [140/740] Generating lib/rte_jobstats_def with a custom command 00:01:42.831 [141/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:42.831 [142/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.831 [143/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:42.831 [144/740] Linking target lib/librte_kvargs.so.23.0 00:01:42.831 [145/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:42.831 [146/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:42.831 [147/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:42.831 [148/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:42.831 [149/740] Generating lib/rte_latencystats_def with a custom command 00:01:42.831 [150/740] Linking static target lib/librte_cfgfile.a 00:01:42.831 [151/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:42.831 [152/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:42.831 [153/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:42.831 [154/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:42.831 [155/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:42.831 [156/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:42.831 [157/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:42.831 [158/740] Generating lib/rte_lpm_mingw with a custom command 00:01:42.831 [159/740] Generating lib/rte_lpm_def with a custom command 00:01:42.831 [160/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:42.831 [161/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:42.831 [162/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:42.831 [163/740] Generating lib/rte_member_def with a custom command 00:01:42.831 [164/740] Generating lib/rte_member_mingw with a custom command 00:01:42.831 [165/740] Generating lib/rte_pcapng_def with a custom command 00:01:42.831 [166/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:42.831 [167/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:42.831 
[168/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:42.831 [169/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:42.831 [170/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.831 [171/740] Linking static target lib/librte_jobstats.a 00:01:43.093 [172/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:43.093 [173/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:43.093 [174/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:43.093 [175/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:43.093 [176/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:43.093 [177/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:43.093 [178/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:43.093 [179/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:43.093 [180/740] Linking static target lib/librte_timer.a 00:01:43.093 [181/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:43.093 [182/740] Generating lib/rte_power_def with a custom command 00:01:43.093 [183/740] Generating lib/rte_power_mingw with a custom command 00:01:43.093 [184/740] Linking static target lib/librte_cmdline.a 00:01:43.093 [185/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:43.093 [186/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:43.093 [187/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:43.093 [188/740] Linking static target lib/librte_telemetry.a 00:01:43.093 [189/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:43.093 [190/740] Generating lib/rte_rawdev_def with a custom command 00:01:43.093 [191/740] Linking static target lib/librte_metrics.a 00:01:43.093 [192/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:43.093 [193/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:43.093 [194/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:43.093 [195/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:43.093 [196/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:43.093 [197/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:43.093 [198/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:43.093 [199/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:43.093 [200/740] Generating lib/rte_regexdev_def with a custom command 00:01:43.093 [201/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:43.093 [202/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:43.093 [203/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:43.093 [204/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:43.093 [205/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:43.093 [206/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:43.093 [207/740] Generating lib/rte_dmadev_def with a custom command 00:01:43.093 [208/740] Generating lib/rte_reorder_mingw with a custom command 00:01:43.093 [209/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:43.093 [210/740] Generating lib/rte_rib_def with a custom command 00:01:43.093 [211/740] 
Generating lib/rte_reorder_def with a custom command 00:01:43.093 [212/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:43.093 [213/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:43.093 [214/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:43.093 [215/740] Generating lib/rte_rib_mingw with a custom command 00:01:43.093 [216/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:43.093 [217/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:43.093 [218/740] Linking static target lib/librte_net.a 00:01:43.093 [219/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:43.093 [220/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:43.093 [221/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:43.093 [222/740] Generating lib/rte_sched_mingw with a custom command 00:01:43.093 [223/740] Generating lib/rte_security_def with a custom command 00:01:43.093 [224/740] Generating lib/rte_security_mingw with a custom command 00:01:43.093 [225/740] Generating lib/rte_sched_def with a custom command 00:01:43.093 [226/740] Linking static target lib/librte_bitratestats.a 00:01:43.093 [227/740] Generating lib/rte_stack_mingw with a custom command 00:01:43.093 [228/740] Generating lib/rte_stack_def with a custom command 00:01:43.094 [229/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:43.094 [230/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:43.094 [231/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:43.094 [232/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:43.094 [233/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:43.094 [234/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:43.094 [235/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:43.094 [236/740] Generating lib/rte_vhost_def with a custom command 00:01:43.094 [237/740] Generating lib/rte_vhost_mingw with a custom command 00:01:43.094 [238/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:43.094 [239/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:43.094 [240/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:43.358 [241/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:43.358 [242/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:43.358 [243/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:43.358 [244/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:43.358 [245/740] Generating lib/rte_ipsec_def with a custom command 00:01:43.358 [246/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:43.358 [247/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:43.358 [248/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:43.358 [249/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:43.358 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:43.358 [251/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:43.358 [252/740] Generating lib/rte_fib_def with a custom command 00:01:43.358 [253/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:43.358 
[254/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:43.358 [255/740] Generating lib/rte_fib_mingw with a custom command 00:01:43.358 [256/740] Linking static target lib/librte_stack.a 00:01:43.358 [257/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:43.358 [258/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:43.358 [259/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:43.358 [260/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:43.358 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:43.358 [262/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:43.358 [263/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:43.358 [264/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:43.358 [265/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:43.358 [266/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:43.358 [267/740] Generating lib/rte_port_def with a custom command 00:01:43.358 [268/740] Generating lib/rte_port_mingw with a custom command 00:01:43.358 [269/740] Linking static target lib/librte_compressdev.a 00:01:43.358 [270/740] Generating lib/rte_pdump_mingw with a custom command 00:01:43.358 [271/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:43.358 [272/740] Generating lib/rte_pdump_def with a custom command 00:01:43.358 [273/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:43.358 [274/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.358 [275/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:43.358 [276/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:43.358 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:43.358 [278/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.358 [279/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:43.358 [280/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:43.358 [281/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:43.358 [282/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.358 [283/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:43.358 [284/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:43.617 [285/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.617 [286/740] Linking static target lib/librte_rawdev.a 00:01:43.617 [287/740] Linking static target lib/librte_mempool.a 00:01:43.617 [288/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:43.617 [289/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:43.617 [290/740] Generating lib/rte_table_def with a custom command 00:01:43.617 [291/740] Linking static target lib/librte_rcu.a 00:01:43.617 [292/740] Generating lib/rte_table_mingw with a custom command 00:01:43.617 [293/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:43.617 [294/740] Linking static target lib/librte_bbdev.a 00:01:43.617 [295/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:43.617 [296/740] 
Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:43.617 [297/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:43.617 [298/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:43.617 [299/740] Linking static target lib/librte_gro.a 00:01:43.617 [300/740] Linking static target lib/librte_dmadev.a 00:01:43.617 [301/740] Linking static target lib/librte_gpudev.a 00:01:43.617 [302/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:43.617 [303/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.617 [304/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:43.617 [305/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:43.617 [306/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.617 [307/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.617 [308/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:43.617 [309/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.617 [310/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:43.617 [311/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:43.617 [312/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:43.617 [313/740] Linking static target lib/librte_gso.a 00:01:43.617 [314/740] Generating lib/rte_pipeline_def with a custom command 00:01:43.617 [315/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:43.617 [316/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:43.617 [317/740] Linking target lib/librte_telemetry.so.23.0 00:01:43.617 [318/740] Linking static target lib/librte_latencystats.a 00:01:43.617 [319/740] Generating lib/rte_graph_def with a custom command 00:01:43.617 [320/740] Generating lib/rte_graph_mingw with a custom command 00:01:43.617 [321/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:43.617 [322/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:43.617 [323/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:43.617 [324/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:43.617 [325/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:43.617 [326/740] Linking static target lib/librte_distributor.a 00:01:43.884 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:43.884 [328/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:43.884 [329/740] Linking static target lib/librte_ip_frag.a 00:01:43.884 [330/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:43.884 [331/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:43.884 [332/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:43.884 [333/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:43.884 [334/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:43.884 [335/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:43.884 [336/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:43.884 [337/740] Linking static target lib/librte_regexdev.a 
00:01:43.884 [338/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:43.884 [339/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:43.884 [340/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:43.884 [341/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:43.884 [342/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:43.884 [343/740] Generating lib/rte_node_def with a custom command 00:01:43.884 [344/740] Generating lib/rte_node_mingw with a custom command 00:01:43.884 [345/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:43.884 [346/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.884 [347/740] Linking static target lib/librte_eal.a 00:01:43.884 [348/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.884 [349/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:43.884 [350/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:43.884 [351/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:43.884 [352/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:43.884 [353/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:43.884 [354/740] Linking static target lib/librte_power.a 00:01:43.884 [355/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:43.884 [356/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:43.884 [357/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:43.884 [358/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:43.884 [359/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:43.884 [360/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:43.884 [361/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:43.884 [362/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:43.884 [363/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.884 [364/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:43.884 [365/740] Linking static target lib/librte_reorder.a 00:01:43.884 [366/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.146 [367/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:44.146 [368/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:44.146 [369/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:44.146 [370/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:44.146 [371/740] Linking static target lib/librte_security.a 00:01:44.146 [372/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:44.146 [373/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.146 [374/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:44.146 [375/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:44.146 [376/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:44.146 [377/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:44.146 [378/740] Linking static target 
lib/librte_pcapng.a 00:01:44.146 [379/740] Linking static target lib/librte_mbuf.a 00:01:44.146 [380/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:44.146 [381/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:44.146 [382/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.146 [383/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:44.146 [384/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:44.146 [385/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:44.146 [386/740] Linking static target lib/librte_bpf.a 00:01:44.146 [387/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:44.146 [388/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:44.146 [389/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:44.146 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:44.146 [391/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:44.146 [392/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:44.146 [393/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.146 [394/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:44.146 [395/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:44.146 [396/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:44.146 [397/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:44.408 [398/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:44.408 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:44.408 [400/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:44.408 [401/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:44.408 [402/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:44.408 [403/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:44.408 [404/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:44.408 [405/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:44.408 [406/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:44.408 [407/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:44.408 [408/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:44.408 [409/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:44.408 [410/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:44.408 [411/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:44.408 [412/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:44.408 [413/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:44.408 [414/740] Linking static target lib/librte_lpm.a 00:01:44.408 [415/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.408 [416/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:44.408 [417/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:44.408 [418/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:44.408 [419/740] Compiling 
C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:44.408 [420/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.408 [421/740] Linking static target lib/librte_rib.a 00:01:44.408 [422/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:44.408 [423/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.408 [424/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:44.408 [425/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.408 [426/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:44.408 [427/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:44.408 [428/740] Linking static target lib/librte_graph.a 00:01:44.408 [429/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:44.408 [430/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:44.408 [431/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:44.408 [432/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:44.408 [433/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:44.408 [434/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:44.673 [435/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:44.673 [436/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:44.673 [437/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:44.673 [438/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:44.673 [439/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.673 [440/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:44.673 [441/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:44.673 [442/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:44.673 [443/740] Linking static target lib/librte_efd.a 00:01:44.674 [444/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:44.674 [445/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:44.674 [446/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:44.674 [447/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:44.674 [448/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.674 [449/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.674 [450/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:44.674 [451/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:44.674 [452/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:44.674 [453/740] Linking static target drivers/librte_bus_vdev.a 00:01:44.674 [454/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:44.674 [455/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.674 [456/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:44.674 [457/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:44.938 [458/740] Compiling C object 
lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:44.938 [459/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.938 [460/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:44.938 [461/740] Linking static target lib/librte_fib.a 00:01:44.938 [462/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:44.938 [463/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.938 [464/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:44.938 [465/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.938 [466/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:44.938 [467/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:44.938 [468/740] Linking static target lib/librte_pdump.a 00:01:44.938 [469/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:44.938 [470/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.938 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:44.938 [472/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.938 [473/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.203 [474/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:45.203 [475/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:45.203 [476/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:45.203 [477/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:45.203 [478/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:45.203 [479/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:45.203 [480/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:45.203 [481/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:45.203 [482/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:45.203 [483/740] Linking static target drivers/librte_bus_pci.a 00:01:45.203 [484/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:45.203 [485/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:45.203 [486/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.203 [487/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.203 [488/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:45.203 [489/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:45.203 [490/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:45.203 [491/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:45.203 [492/740] Linking static target lib/librte_table.a 00:01:45.203 [493/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:45.203 [494/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:45.203 [495/740] Compiling C object 
app/dpdk-test-acl.p/test-acl_main.c.o 00:01:45.203 [496/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:45.464 [497/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:45.464 [498/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:45.464 [499/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:45.464 [500/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:45.464 [501/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:45.464 [502/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:45.464 [503/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.464 [504/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:45.464 [505/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:45.464 [506/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:45.464 [507/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:45.464 [508/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.464 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:45.464 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:45.464 [511/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.464 [512/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:45.464 [513/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:45.464 [514/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:45.464 [515/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:45.464 [516/740] Linking static target lib/librte_cryptodev.a 00:01:45.464 [517/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:45.464 [518/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:45.464 [519/740] Linking static target lib/librte_sched.a 00:01:45.464 [520/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:45.464 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:45.464 [522/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:45.464 [523/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:45.464 [524/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:45.464 [525/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:45.723 [526/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:45.723 [527/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:45.723 [528/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:45.723 [529/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:45.723 [530/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:45.723 [531/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:45.723 [532/740] Compiling C 
object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:45.723 [533/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.723 [534/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:45.723 [535/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:45.723 [536/740] Linking static target lib/librte_node.a 00:01:45.723 [537/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:45.723 [538/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:45.723 [539/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:45.723 [540/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:45.723 [541/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:45.724 [542/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:45.724 [543/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.724 [544/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:45.724 [545/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:45.724 [546/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:45.724 [547/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:45.724 [548/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:45.724 [549/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:45.724 [550/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:45.724 [551/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:45.724 [552/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:45.724 [553/740] Linking static target lib/librte_ipsec.a 00:01:45.724 [554/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:45.724 [555/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:45.724 [556/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:45.724 [557/740] Linking static target drivers/librte_mempool_ring.a 00:01:45.724 [558/740] Linking static target lib/librte_ethdev.a 00:01:45.724 [559/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:45.984 [560/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:45.984 [561/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:45.984 [562/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:45.984 [563/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:45.984 [564/740] Linking static target lib/librte_port.a 00:01:45.984 [565/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:45.984 [566/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:45.984 [567/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.984 [568/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:45.984 [569/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:45.984 [570/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:45.984 
[571/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:45.984 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:45.984 [573/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:45.984 [574/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:45.984 [575/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:45.984 [576/740] Linking static target lib/librte_member.a 00:01:45.984 [577/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:45.984 [578/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:45.984 [579/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:45.984 [580/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:45.984 [581/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:46.243 [582/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:46.243 [583/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:46.243 [584/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:46.243 [585/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:46.243 [586/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:46.243 [587/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:46.243 [588/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:46.243 [589/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.243 [590/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.243 [591/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:46.243 [592/740] Linking static target lib/librte_eventdev.a 00:01:46.243 [593/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:46.243 [594/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.243 [595/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:46.243 [596/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:46.502 [597/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:46.502 [598/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:46.502 [599/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:46.502 [600/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:46.502 [601/740] Linking static target lib/librte_hash.a 00:01:46.502 [602/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:46.502 [603/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:46.502 [604/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.502 [605/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:46.502 [606/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:46.502 [607/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:46.761 [608/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:46.761 [609/740] 
Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:46.761 [610/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:46.761 [611/740] Linking static target lib/librte_acl.a 00:01:46.761 [612/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.761 [613/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:47.019 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:47.019 [615/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:47.019 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:47.279 [617/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.537 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.537 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:47.796 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:47.796 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:48.366 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:48.366 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:48.625 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:48.882 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:48.882 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:48.882 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:49.140 [628/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.140 [629/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:49.399 [630/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:49.399 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:49.657 [632/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.916 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.194 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.194 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:55.194 [636/740] Linking static target lib/librte_vhost.a 00:01:55.764 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:55.764 [638/740] Linking static target lib/librte_pipeline.a 00:01:56.334 [639/740] Linking target app/dpdk-dumpcap 00:01:56.334 [640/740] Linking target app/dpdk-test-pipeline 00:01:56.334 [641/740] Linking target app/dpdk-test-compress-perf 00:01:56.334 [642/740] Linking target app/dpdk-proc-info 00:01:56.334 [643/740] Linking target app/dpdk-test-flow-perf 00:01:56.334 [644/740] Linking target app/dpdk-test-cmdline 00:01:56.334 [645/740] Linking target app/dpdk-test-regex 00:01:56.334 [646/740] Linking target app/dpdk-test-gpudev 00:01:56.334 [647/740] Linking target app/dpdk-test-security-perf 00:01:56.334 [648/740] Linking target app/dpdk-test-bbdev 00:01:56.334 [649/740] Linking target app/dpdk-test-crypto-perf 00:01:56.334 [650/740] Linking target app/dpdk-test-acl 00:01:56.334 [651/740] Linking target app/dpdk-test-sad 00:01:56.334 
[652/740] Linking target app/dpdk-test-fib 00:01:56.334 [653/740] Linking target app/dpdk-pdump 00:01:56.334 [654/740] Linking target app/dpdk-test-eventdev 00:01:56.334 [655/740] Linking target app/dpdk-testpmd 00:01:57.714 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.974 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.233 [658/740] Linking target lib/librte_eal.so.23.0 00:01:58.233 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:01:58.233 [660/740] Linking target lib/librte_ring.so.23.0 00:01:58.233 [661/740] Linking target lib/librte_meter.so.23.0 00:01:58.233 [662/740] Linking target lib/librte_cfgfile.so.23.0 00:01:58.233 [663/740] Linking target lib/librte_pci.so.23.0 00:01:58.233 [664/740] Linking target lib/librte_timer.so.23.0 00:01:58.233 [665/740] Linking target lib/librte_jobstats.so.23.0 00:01:58.233 [666/740] Linking target lib/librte_dmadev.so.23.0 00:01:58.233 [667/740] Linking target lib/librte_graph.so.23.0 00:01:58.233 [668/740] Linking target lib/librte_rawdev.so.23.0 00:01:58.233 [669/740] Linking target lib/librte_stack.so.23.0 00:01:58.233 [670/740] Linking target drivers/librte_bus_vdev.so.23.0 00:01:58.233 [671/740] Linking target lib/librte_acl.so.23.0 00:01:58.493 [672/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:01:58.493 [673/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:01:58.493 [674/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:01:58.493 [675/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:01:58.493 [676/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:01:58.493 [677/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:01:58.493 [678/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:01:58.493 [679/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:01:58.493 [680/740] Linking target lib/librte_rcu.so.23.0 00:01:58.493 [681/740] Linking target lib/librte_mempool.so.23.0 00:01:58.493 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:01:58.493 [683/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:01:58.752 [684/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:01:58.752 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:01:58.752 [686/740] Linking target drivers/librte_mempool_ring.so.23.0 00:01:58.752 [687/740] Linking target lib/librte_rib.so.23.0 00:01:58.752 [688/740] Linking target lib/librte_mbuf.so.23.0 00:01:58.752 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:01:58.752 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:01:58.752 [691/740] Linking target lib/librte_fib.so.23.0 00:01:58.752 [692/740] Linking target lib/librte_net.so.23.0 00:01:58.752 [693/740] Linking target lib/librte_bbdev.so.23.0 00:01:58.752 [694/740] Linking target lib/librte_reorder.so.23.0 00:01:58.752 [695/740] Linking target lib/librte_gpudev.so.23.0 00:01:58.752 [696/740] Linking target lib/librte_compressdev.so.23.0 00:01:58.752 [697/740] Linking target lib/librte_distributor.so.23.0 00:01:58.752 [698/740] 
Linking target lib/librte_regexdev.so.23.0
00:01:58.752 [699/740] Linking target lib/librte_sched.so.23.0
00:01:58.752 [700/740] Linking target lib/librte_cryptodev.so.23.0
00:01:59.012 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols
00:01:59.012 [702/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols
00:01:59.012 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols
00:01:59.012 [704/740] Linking target lib/librte_cmdline.so.23.0
00:01:59.012 [705/740] Linking target lib/librte_hash.so.23.0
00:01:59.012 [706/740] Linking target lib/librte_security.so.23.0
00:01:59.012 [707/740] Linking target lib/librte_ethdev.so.23.0
00:01:59.272 [708/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols
00:01:59.272 [709/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols
00:01:59.272 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols
00:01:59.272 [711/740] Linking target lib/librte_efd.so.23.0
00:01:59.272 [712/740] Linking target lib/librte_lpm.so.23.0
00:01:59.272 [713/740] Linking target lib/librte_member.so.23.0
00:01:59.272 [714/740] Linking target lib/librte_metrics.so.23.0
00:01:59.272 [715/740] Linking target lib/librte_ipsec.so.23.0
00:01:59.272 [716/740] Linking target lib/librte_bpf.so.23.0
00:01:59.272 [717/740] Linking target lib/librte_pcapng.so.23.0
00:01:59.272 [718/740] Linking target lib/librte_gso.so.23.0
00:01:59.272 [719/740] Linking target lib/librte_gro.so.23.0
00:01:59.272 [720/740] Linking target lib/librte_ip_frag.so.23.0
00:01:59.272 [721/740] Linking target lib/librte_eventdev.so.23.0
00:01:59.272 [722/740] Linking target lib/librte_power.so.23.0
00:01:59.272 [723/740] Linking target lib/librte_vhost.so.23.0
00:01:59.272 [724/740] Linking target drivers/librte_net_i40e.so.23.0
00:01:59.272 [725/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols
00:01:59.272 [726/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols
00:01:59.272 [727/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols
00:01:59.532 [728/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols
00:01:59.532 [729/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols
00:01:59.532 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols
00:01:59.532 [731/740] Linking target lib/librte_bitratestats.so.23.0
00:01:59.532 [732/740] Linking target lib/librte_node.so.23.0
00:01:59.532 [733/740] Linking target lib/librte_latencystats.so.23.0
00:01:59.532 [734/740] Linking target lib/librte_pdump.so.23.0
00:01:59.532 [735/740] Linking target lib/librte_port.so.23.0
00:01:59.532 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols
00:01:59.792 [737/740] Linking target lib/librte_table.so.23.0
00:01:59.792 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols
00:02:01.174 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:01.175 [740/740] Linking target lib/librte_pipeline.so.23.0
00:02:01.175 17:45:53 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
00:02:01.175 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:01.175 [0/1] Installing files.
00:02:01.441 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.441 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:01.442 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:01.443 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.444 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:01.445 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:01.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:01.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:01.473 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:01.474 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:01.474 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:01.474 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_hash.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.474 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:01.475 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.475 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_ipsec.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.739 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.739 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.739 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.739 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.740 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.740 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-eventdev to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.743 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
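The usertools scripts staged into build/bin above are DPDK's standard host-setup helpers. As a rough, illustrative sketch of how a test node is usually prepared with them (the hugepage amount and the PCI address below are hypothetical, not values taken from this run):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
    sudo ./dpdk-hugepages.py --setup 2G                   # reserve and mount hugepages (amount illustrative)
    ./dpdk-devbind.py --status                            # list NICs and their current kernel/userspace drivers
    sudo ./dpdk-devbind.py --bind=vfio-pci 0000:02:00.0   # bind an example device to vfio-pci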
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
00:02:01.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
00:02:01.743 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23
00:02:01.743 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so
00:02:01.743 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23
00:02:01.743 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so
00:02:01.743 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23
00:02:01.744 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so
00:02:01.744 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23
00:02:01.744 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so
00:02:01.744 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23
00:02:01.744 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so
00:02:01.744 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23
00:02:01.744 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so
00:02:01.744 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23
00:02:01.744 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so
00:02:01.744 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23
00:02:01.744 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so
00:02:01.744 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23
00:02:01.744 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so
00:02:01.744 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23
00:02:01.744 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so
00:02:01.744 Installing symlink pointing to librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23
00:02:01.744 Installing symlink
pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:01.744 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:01.744 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:01.744 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:01.744 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:01.744 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:01.744 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:01.744 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:01.744 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:01.744 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:01.744 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:01.744 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:01.744 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:01.744 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:01.744 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:01.744 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:01.744 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:01.744 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:01.744 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:01.744 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:01.744 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:01.744 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:01.744 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:01.744 Installing symlink pointing to librte_distributor.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:01.744 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:01.744 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:01.744 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:01.744 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:01.744 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:01.744 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:01.744 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:01.744 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:01.744 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:01.744 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:01.744 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:01.744 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:01.744 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:01.744 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:01.744 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:01.744 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:01.744 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:01.744 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:01.744 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:01.744 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:01.744 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:01.744 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:01.744 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:01.744 Installing symlink pointing to 
librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:01.744 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:01.744 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:01.744 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:01.744 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:01.744 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:01.744 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:01.744 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:01.744 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:01.744 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:01.744 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:01.744 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:01.744 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:01.744 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:01.745 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:01.745 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:01.745 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:01.745 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:01.745 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:01.745 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:01.745 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:01.745 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:01.745 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:01.745 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:01.745 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:01.745 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:01.745 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:01.745 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23'
00:02:01.745 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0'
00:02:01.745 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so'
00:02:01.745 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23'
00:02:01.745 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0'
00:02:01.745 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23
00:02:01.745 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so
00:02:01.745 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23
00:02:01.745 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so
00:02:01.745 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23
00:02:01.745 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so
00:02:01.745 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23
00:02:01.745 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so
00:02:01.745 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23
00:02:01.745 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so
00:02:01.745 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23
00:02:01.745 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so
00:02:01.745 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23
00:02:01.745 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so
00:02:01.745 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23
00:02:01.745 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so
00:02:01.745 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23
00:02:01.745 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so
00:02:01.745 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23
00:02:01.745 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so
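The symlink chains being created here follow the usual ELF soname convention: the real object is librte_<name>.so.23.0, the librte_<name>.so.23 link matches the SONAME the dynamic loader resolves at run time, and the bare .so link only serves the link-time -lrte_<name> lookup; the bus and net PMDs are additionally relocated under the dpdk/pmds-23.0 plugin directory by the custom install script just below. One way to inspect the result on the staged tree (librte_eal is just one example):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    ls -l librte_eal.so librte_eal.so.23          # both ultimately resolve to librte_eal.so.23.0
    readelf -d librte_eal.so.23.0 | grep SONAME   # soname recorded as librte_eal.so.23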
00:02:01.745 Installing symlink pointing to librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23
00:02:01.745 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so
00:02:01.745 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23
00:02:01.745 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so
00:02:01.745 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0'
00:02:01.745 17:45:54 -- common/autobuild_common.sh@192 -- $ uname -s
00:02:01.745 17:45:54 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:01.745 17:45:54 -- common/autobuild_common.sh@203 -- $ cat
00:02:01.745 17:45:54 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:01.745
00:02:01.745 real 0m25.786s
00:02:01.745 user 6m35.213s
00:02:01.745 sys 2m12.605s
00:02:01.745 17:45:54 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:02:01.745 17:45:54 -- common/autotest_common.sh@10 -- $ set +x
00:02:01.745 ************************************
00:02:01.745 END TEST build_native_dpdk
00:02:01.745 ************************************
00:02:02.005 17:45:54 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:02.005 17:45:54 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:02.005 17:45:54 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:02.005 17:45:54 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:02.005 17:45:54 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:02.005 17:45:54 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
00:02:02.005 17:45:54 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:02:02.005 17:45:54 -- common/autotest_common.sh@10 -- $ set +x
00:02:02.005 ************************************
00:02:02.005 START TEST autobuild_llvm_precompile
00:02:02.005 ************************************
00:02:02.005 17:45:54 -- common/autotest_common.sh@1114 -- $ _llvm_precompile
00:02:02.005 17:45:54 -- common/autobuild_common.sh@32 -- $ clang --version
00:02:02.005 17:45:54 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:02:02.005 Target: x86_64-redhat-linux-gnu
00:02:02.005 Thread model: posix
00:02:02.005 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:02.005 17:45:54 -- common/autobuild_common.sh@33 -- $ clang_num=17
00:02:02.005 17:45:54 -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:02:02.005 17:45:54 -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:02:02.005 17:45:54 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:02:02.005 17:45:54 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:02:02.005 17:45:54 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:02:02.005 17:45:54 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:02.005 17:45:54 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
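The precompile step pins CC/CXX to clang-17 and verifies that libclang_rt.fuzzer_no_main.a exists: that archive is the libFuzzer runtime built without its own main(), which lets SPDK's fuzz apps keep their own entry point and start the fuzzing engine themselves. A minimal sketch of the two linking modes against the same runtime (target.c and main.c are illustrative file names, not part of this run):

    # Normal libFuzzer build: clang links the runtime including its built-in main().
    # target.c defines LLVMFuzzerTestOneInput(const uint8_t *data, size_t size).
    clang-17 -g -O1 -fsanitize=fuzzer target.c -o fuzz_bin
    ./fuzz_bin -runs=1000 corpus/
    # No-main variant: instrument only, link the archive explicitly; main.c must
    # start fuzzing itself, e.g. via LLVMFuzzerRunDriver().
    clang-17 -g -O1 -fsanitize=fuzzer-no-link main.c target.c \
        /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a -o fuzz_app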
00:02:02.005 17:45:54 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
00:02:02.005 17:45:54 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:02.265 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:02.265 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:02.265 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:02.525 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:02.785 Using 'verbs' RDMA provider
00:02:18.617 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:30.838 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:31.406 Creating mk/config.mk...done.
00:02:31.406 Creating mk/cc.flags.mk...done.
00:02:31.406 Type 'make' to build.
00:02:31.406
00:02:31.406 real 0m29.350s
00:02:31.406 user 0m12.756s
00:02:31.406 sys 0m16.070s
00:02:31.406 17:46:23 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:02:31.406 17:46:23 -- common/autotest_common.sh@10 -- $ set +x
00:02:31.406 ************************************
00:02:31.406 END TEST autobuild_llvm_precompile
00:02:31.406 ************************************
00:02:31.407 17:46:24 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
17:46:24 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
17:46:24 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
17:46:24 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
17:46:24 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:31.407 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:31.666 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:31.666 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:31.666 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:32.235 Using 'verbs' RDMA provider
00:02:45.026 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:57.251 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:57.251 Creating mk/config.mk...done.
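Both configure passes above resolve the freshly staged DPDK through the libdpdk.pc metadata installed earlier rather than any system copy, which is what the "Using .../pkgconfig for additional libs" lines report. The same mechanism works for any out-of-tree consumer; a minimal sketch (app.c is an illustrative file name):

    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    pkg-config --modversion libdpdk                         # version of this staged build
    cc app.c $(pkg-config --cflags --libs libdpdk) -o app   # compile and link against it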
00:02:57.251 Creating mk/cc.flags.mk...done.
00:02:57.251 Type 'make' to build.
00:02:57.251 17:46:49 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:57.251 17:46:49 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:02:57.251 17:46:49 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:02:57.251 17:46:49 -- common/autotest_common.sh@10 -- $ set +x
00:02:57.251 ************************************
00:02:57.251 START TEST make
00:02:57.251 ************************************
00:02:57.251 17:46:49 -- common/autotest_common.sh@1114 -- $ make -j112
00:02:57.251 make[1]: Nothing to be done for 'all'.
00:02:58.189 The Meson build system
00:02:58.189 Version: 1.5.0
00:02:58.189 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:58.189 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:58.189 Build type: native build
00:02:58.189 Project name: libvfio-user
00:02:58.189 Project version: 0.0.1
00:02:58.189 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:58.189 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:58.189 Host machine cpu family: x86_64
00:02:58.189 Host machine cpu: x86_64
00:02:58.189 Run-time dependency threads found: YES
00:02:58.189 Library dl found: YES
00:02:58.189 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:58.189 Run-time dependency json-c found: YES 0.17
00:02:58.189 Run-time dependency cmocka found: YES 1.1.7
00:02:58.189 Program pytest-3 found: NO
00:02:58.189 Program flake8 found: NO
00:02:58.189 Program misspell-fixer found: NO
00:02:58.189 Program restructuredtext-lint found: NO
00:02:58.189 Program valgrind found: YES (/usr/bin/valgrind)
00:02:58.189 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:58.189 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:58.189 Compiler for C supports arguments -Wwrite-strings: YES
00:02:58.189 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:58.189 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:58.189 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:58.189 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
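libvfio-user is configured here as an out-of-tree Meson project with a debug buildtype and a static default_library; the target summary and the ninja compile steps follow below. Reproducing the same setup by hand would look roughly like this (the staging path is illustrative):

    meson setup build-debug /path/to/libvfio-user --buildtype=debug -Ddefault_library=static
    ninja -C build-debug
    DESTDIR=/tmp/libvfio-user-stage meson install --quiet -C build-debug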
00:02:58.189 Build targets in project: 8 00:02:58.189 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:58.190 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:58.190 00:02:58.190 libvfio-user 0.0.1 00:02:58.190 00:02:58.190 User defined options 00:02:58.190 buildtype : debug 00:02:58.190 default_library: static 00:02:58.190 libdir : /usr/local/lib 00:02:58.190 00:02:58.190 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:58.758 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:58.758 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:58.758 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:58.758 [3/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:58.758 [4/36] Compiling C object samples/null.p/null.c.o 00:02:58.758 [5/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:58.758 [6/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:58.758 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:58.758 [8/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:58.758 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:58.758 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:58.758 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:58.758 [12/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:58.758 [13/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:58.758 [14/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:58.758 [15/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:58.758 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:58.758 [17/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:58.758 [18/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:58.758 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:58.758 [20/36] Compiling C object samples/server.p/server.c.o 00:02:58.758 [21/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:58.758 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:58.758 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:58.758 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:58.758 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:58.758 [26/36] Compiling C object samples/client.p/client.c.o 00:02:58.758 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:58.758 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:58.758 [29/36] Linking static target lib/libvfio-user.a 00:02:58.758 [30/36] Linking target samples/client 00:02:58.758 [31/36] Linking target samples/gpio-pci-idio-16 00:02:58.758 [32/36] Linking target test/unit_tests 00:02:58.758 [33/36] Linking target samples/null 00:02:58.758 [34/36] Linking target samples/server 00:02:58.758 [35/36] Linking target samples/shadow_ioeventfd_server 00:02:58.758 [36/36] Linking target samples/lspci 00:02:58.758 INFO: autodetecting backend as ninja 00:02:58.758 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:59.017 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:59.277 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:59.277 ninja: no work to do. 00:03:02.570 CC lib/ut/ut.o 00:03:02.570 CC lib/log/log.o 00:03:02.570 CC lib/log/log_flags.o 00:03:02.570 CC lib/log/log_deprecated.o 00:03:02.570 CC lib/ut_mock/mock.o 00:03:02.570 LIB libspdk_ut_mock.a 00:03:02.570 LIB libspdk_ut.a 00:03:02.570 LIB libspdk_log.a 00:03:02.829 CC lib/dma/dma.o 00:03:02.829 CC lib/util/base64.o 00:03:02.829 CC lib/util/bit_array.o 00:03:02.829 CC lib/ioat/ioat.o 00:03:02.829 CC lib/util/cpuset.o 00:03:02.829 CC lib/util/crc16.o 00:03:02.829 CC lib/util/crc32.o 00:03:02.829 CXX lib/trace_parser/trace.o 00:03:02.829 CC lib/util/crc32c.o 00:03:02.829 CC lib/util/crc64.o 00:03:02.829 CC lib/util/crc32_ieee.o 00:03:02.829 CC lib/util/dif.o 00:03:02.829 CC lib/util/fd.o 00:03:02.829 CC lib/util/file.o 00:03:02.829 CC lib/util/hexlify.o 00:03:02.829 CC lib/util/iov.o 00:03:02.829 CC lib/util/pipe.o 00:03:02.829 CC lib/util/math.o 00:03:02.829 CC lib/util/strerror_tls.o 00:03:02.829 CC lib/util/string.o 00:03:02.830 CC lib/util/uuid.o 00:03:02.830 CC lib/util/fd_group.o 00:03:02.830 CC lib/util/xor.o 00:03:02.830 CC lib/util/zipf.o 00:03:03.088 CC lib/vfio_user/host/vfio_user_pci.o 00:03:03.088 CC lib/vfio_user/host/vfio_user.o 00:03:03.088 LIB libspdk_dma.a 00:03:03.088 LIB libspdk_ioat.a 00:03:03.088 LIB libspdk_vfio_user.a 00:03:03.088 LIB libspdk_util.a 00:03:03.347 LIB libspdk_trace_parser.a 00:03:03.606 CC lib/json/json_parse.o 00:03:03.606 CC lib/json/json_util.o 00:03:03.606 CC lib/json/json_write.o 00:03:03.606 CC lib/idxd/idxd.o 00:03:03.606 CC lib/idxd/idxd_user.o 00:03:03.606 CC lib/vmd/vmd.o 00:03:03.606 CC lib/idxd/idxd_kernel.o 00:03:03.606 CC lib/rdma/common.o 00:03:03.606 CC lib/vmd/led.o 00:03:03.606 CC lib/rdma/rdma_verbs.o 00:03:03.606 CC lib/conf/conf.o 00:03:03.606 CC lib/env_dpdk/env.o 00:03:03.606 CC lib/env_dpdk/memory.o 00:03:03.606 CC lib/env_dpdk/pci.o 00:03:03.606 CC lib/env_dpdk/init.o 00:03:03.606 CC lib/env_dpdk/threads.o 00:03:03.606 CC lib/env_dpdk/pci_ioat.o 00:03:03.606 CC lib/env_dpdk/pci_virtio.o 00:03:03.606 CC lib/env_dpdk/pci_vmd.o 00:03:03.606 CC lib/env_dpdk/pci_idxd.o 00:03:03.606 CC lib/env_dpdk/pci_event.o 00:03:03.606 CC lib/env_dpdk/sigbus_handler.o 00:03:03.606 CC lib/env_dpdk/pci_dpdk.o 00:03:03.606 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:03.606 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:03.606 LIB libspdk_conf.a 00:03:03.606 LIB libspdk_json.a 00:03:03.606 LIB libspdk_rdma.a 00:03:03.865 LIB libspdk_idxd.a 00:03:03.865 LIB libspdk_vmd.a 00:03:04.126 CC lib/jsonrpc/jsonrpc_server.o 00:03:04.126 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:04.126 CC lib/jsonrpc/jsonrpc_client.o 00:03:04.126 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:04.126 LIB libspdk_jsonrpc.a 00:03:04.386 LIB libspdk_env_dpdk.a 00:03:04.386 CC lib/rpc/rpc.o 00:03:04.646 LIB libspdk_rpc.a 00:03:04.906 CC lib/trace/trace.o 00:03:04.906 CC lib/notify/notify.o 00:03:04.906 CC lib/trace/trace_flags.o 00:03:04.906 CC lib/notify/notify_rpc.o 00:03:04.906 CC lib/trace/trace_rpc.o 00:03:04.906 CC lib/sock/sock.o 00:03:04.906 CC lib/sock/sock_rpc.o 00:03:05.165 LIB libspdk_notify.a 00:03:05.165 LIB libspdk_trace.a 00:03:05.165 LIB libspdk_sock.a 00:03:05.424 CC lib/thread/thread.o 00:03:05.424 CC lib/thread/iobuf.o 00:03:05.424 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:05.424 CC lib/nvme/nvme_ctrlr.o 00:03:05.424 CC 
lib/nvme/nvme_fabric.o 00:03:05.424 CC lib/nvme/nvme_ns_cmd.o 00:03:05.424 CC lib/nvme/nvme_ns.o 00:03:05.424 CC lib/nvme/nvme_pcie_common.o 00:03:05.424 CC lib/nvme/nvme_pcie.o 00:03:05.424 CC lib/nvme/nvme_qpair.o 00:03:05.424 CC lib/nvme/nvme.o 00:03:05.424 CC lib/nvme/nvme_quirks.o 00:03:05.424 CC lib/nvme/nvme_transport.o 00:03:05.424 CC lib/nvme/nvme_discovery.o 00:03:05.424 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:05.424 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:05.424 CC lib/nvme/nvme_tcp.o 00:03:05.424 CC lib/nvme/nvme_opal.o 00:03:05.424 CC lib/nvme/nvme_io_msg.o 00:03:05.424 CC lib/nvme/nvme_poll_group.o 00:03:05.424 CC lib/nvme/nvme_zns.o 00:03:05.424 CC lib/nvme/nvme_cuse.o 00:03:05.424 CC lib/nvme/nvme_vfio_user.o 00:03:05.424 CC lib/nvme/nvme_rdma.o 00:03:06.364 LIB libspdk_thread.a 00:03:06.364 CC lib/blob/blobstore.o 00:03:06.364 CC lib/accel/accel.o 00:03:06.364 CC lib/blob/request.o 00:03:06.364 CC lib/accel/accel_rpc.o 00:03:06.364 CC lib/blob/zeroes.o 00:03:06.364 CC lib/accel/accel_sw.o 00:03:06.364 CC lib/blob/blob_bs_dev.o 00:03:06.364 CC lib/vfu_tgt/tgt_endpoint.o 00:03:06.364 CC lib/vfu_tgt/tgt_rpc.o 00:03:06.364 CC lib/init/json_config.o 00:03:06.364 CC lib/virtio/virtio.o 00:03:06.364 CC lib/init/subsystem.o 00:03:06.364 CC lib/init/subsystem_rpc.o 00:03:06.364 CC lib/virtio/virtio_vhost_user.o 00:03:06.364 CC lib/init/rpc.o 00:03:06.364 CC lib/virtio/virtio_vfio_user.o 00:03:06.364 CC lib/virtio/virtio_pci.o 00:03:06.622 LIB libspdk_init.a 00:03:06.622 LIB libspdk_virtio.a 00:03:06.622 LIB libspdk_vfu_tgt.a 00:03:06.622 LIB libspdk_nvme.a 00:03:06.882 CC lib/event/app.o 00:03:06.882 CC lib/event/reactor.o 00:03:06.882 CC lib/event/log_rpc.o 00:03:06.882 CC lib/event/app_rpc.o 00:03:06.882 CC lib/event/scheduler_static.o 00:03:07.141 LIB libspdk_accel.a 00:03:07.141 LIB libspdk_event.a 00:03:07.401 CC lib/bdev/bdev_rpc.o 00:03:07.401 CC lib/bdev/bdev.o 00:03:07.401 CC lib/bdev/bdev_zone.o 00:03:07.401 CC lib/bdev/part.o 00:03:07.401 CC lib/bdev/scsi_nvme.o 00:03:07.969 LIB libspdk_blob.a 00:03:08.229 CC lib/lvol/lvol.o 00:03:08.229 CC lib/blobfs/blobfs.o 00:03:08.229 CC lib/blobfs/tree.o 00:03:08.798 LIB libspdk_lvol.a 00:03:08.798 LIB libspdk_blobfs.a 00:03:09.058 LIB libspdk_bdev.a 00:03:09.317 CC lib/nvmf/ctrlr.o 00:03:09.317 CC lib/nvmf/ctrlr_discovery.o 00:03:09.317 CC lib/nvmf/ctrlr_bdev.o 00:03:09.317 CC lib/ftl/ftl_core.o 00:03:09.317 CC lib/nvmf/subsystem.o 00:03:09.317 CC lib/ftl/ftl_init.o 00:03:09.317 CC lib/nvmf/nvmf.o 00:03:09.317 CC lib/ftl/ftl_layout.o 00:03:09.317 CC lib/nvmf/nvmf_rpc.o 00:03:09.317 CC lib/ftl/ftl_debug.o 00:03:09.317 CC lib/nvmf/transport.o 00:03:09.317 CC lib/nvmf/tcp.o 00:03:09.317 CC lib/ftl/ftl_sb.o 00:03:09.317 CC lib/ftl/ftl_io.o 00:03:09.317 CC lib/ftl/ftl_l2p.o 00:03:09.317 CC lib/nvmf/vfio_user.o 00:03:09.317 CC lib/scsi/dev.o 00:03:09.317 CC lib/nvmf/rdma.o 00:03:09.317 CC lib/scsi/lun.o 00:03:09.317 CC lib/ftl/ftl_l2p_flat.o 00:03:09.317 CC lib/scsi/scsi.o 00:03:09.317 CC lib/scsi/port.o 00:03:09.317 CC lib/ftl/ftl_nv_cache.o 00:03:09.317 CC lib/ftl/ftl_band.o 00:03:09.317 CC lib/scsi/scsi_pr.o 00:03:09.317 CC lib/scsi/scsi_bdev.o 00:03:09.317 CC lib/ublk/ublk.o 00:03:09.317 CC lib/ftl/ftl_band_ops.o 00:03:09.317 CC lib/ublk/ublk_rpc.o 00:03:09.317 CC lib/ftl/ftl_writer.o 00:03:09.317 CC lib/scsi/scsi_rpc.o 00:03:09.317 CC lib/scsi/task.o 00:03:09.317 CC lib/ftl/ftl_rq.o 00:03:09.317 CC lib/ftl/ftl_reloc.o 00:03:09.317 CC lib/ftl/ftl_l2p_cache.o 00:03:09.317 CC lib/ftl/ftl_p2l.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt.o 
00:03:09.317 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:09.317 CC lib/nbd/nbd.o 00:03:09.317 CC lib/nbd/nbd_rpc.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:09.317 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:09.317 CC lib/ftl/utils/ftl_conf.o 00:03:09.317 CC lib/ftl/utils/ftl_md.o 00:03:09.317 CC lib/ftl/utils/ftl_mempool.o 00:03:09.317 CC lib/ftl/utils/ftl_bitmap.o 00:03:09.317 CC lib/ftl/utils/ftl_property.o 00:03:09.317 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:09.317 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:09.317 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:09.317 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:09.317 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:09.317 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:09.317 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:09.317 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:09.317 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:09.317 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:09.317 CC lib/ftl/base/ftl_base_bdev.o 00:03:09.317 CC lib/ftl/base/ftl_base_dev.o 00:03:09.317 CC lib/ftl/ftl_trace.o 00:03:09.886 LIB libspdk_nbd.a 00:03:09.886 LIB libspdk_scsi.a 00:03:09.886 LIB libspdk_ublk.a 00:03:09.886 LIB libspdk_ftl.a 00:03:10.146 CC lib/iscsi/conn.o 00:03:10.146 CC lib/vhost/vhost.o 00:03:10.146 CC lib/iscsi/init_grp.o 00:03:10.146 CC lib/vhost/vhost_rpc.o 00:03:10.146 CC lib/vhost/vhost_blk.o 00:03:10.146 CC lib/vhost/vhost_scsi.o 00:03:10.146 CC lib/iscsi/iscsi.o 00:03:10.146 CC lib/iscsi/md5.o 00:03:10.146 CC lib/vhost/rte_vhost_user.o 00:03:10.146 CC lib/iscsi/param.o 00:03:10.146 CC lib/iscsi/portal_grp.o 00:03:10.146 CC lib/iscsi/tgt_node.o 00:03:10.146 CC lib/iscsi/iscsi_subsystem.o 00:03:10.146 CC lib/iscsi/iscsi_rpc.o 00:03:10.146 CC lib/iscsi/task.o 00:03:10.405 LIB libspdk_nvmf.a 00:03:10.665 LIB libspdk_vhost.a 00:03:10.924 LIB libspdk_iscsi.a 00:03:11.184 CC module/env_dpdk/env_dpdk_rpc.o 00:03:11.184 CC module/vfu_device/vfu_virtio.o 00:03:11.184 CC module/vfu_device/vfu_virtio_blk.o 00:03:11.184 CC module/vfu_device/vfu_virtio_scsi.o 00:03:11.184 CC module/vfu_device/vfu_virtio_rpc.o 00:03:11.443 LIB libspdk_env_dpdk_rpc.a 00:03:11.443 CC module/accel/dsa/accel_dsa.o 00:03:11.443 CC module/accel/dsa/accel_dsa_rpc.o 00:03:11.443 CC module/accel/error/accel_error.o 00:03:11.443 CC module/accel/error/accel_error_rpc.o 00:03:11.443 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:11.443 CC module/accel/ioat/accel_ioat_rpc.o 00:03:11.443 CC module/sock/posix/posix.o 00:03:11.443 CC module/accel/ioat/accel_ioat.o 00:03:11.443 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:11.443 CC module/blob/bdev/blob_bdev.o 00:03:11.443 CC module/scheduler/gscheduler/gscheduler.o 00:03:11.443 CC module/accel/iaa/accel_iaa.o 00:03:11.443 CC module/accel/iaa/accel_iaa_rpc.o 00:03:11.443 LIB libspdk_scheduler_dpdk_governor.a 00:03:11.443 LIB libspdk_scheduler_gscheduler.a 00:03:11.443 LIB libspdk_accel_error.a 00:03:11.443 LIB libspdk_scheduler_dynamic.a 00:03:11.443 LIB libspdk_accel_ioat.a 00:03:11.702 LIB libspdk_accel_iaa.a 00:03:11.702 LIB libspdk_accel_dsa.a 00:03:11.702 LIB libspdk_blob_bdev.a 00:03:11.702 LIB libspdk_vfu_device.a 00:03:11.960 LIB 
libspdk_sock_posix.a 00:03:11.960 CC module/bdev/lvol/vbdev_lvol.o 00:03:11.960 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:11.960 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:11.960 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:11.960 CC module/bdev/delay/vbdev_delay.o 00:03:11.960 CC module/bdev/malloc/bdev_malloc.o 00:03:11.960 CC module/bdev/nvme/bdev_nvme.o 00:03:11.960 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:11.960 CC module/bdev/null/bdev_null.o 00:03:11.960 CC module/bdev/nvme/nvme_rpc.o 00:03:11.960 CC module/bdev/error/vbdev_error.o 00:03:11.960 CC module/bdev/nvme/bdev_mdns_client.o 00:03:11.960 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:11.960 CC module/bdev/ftl/bdev_ftl.o 00:03:11.960 CC module/bdev/error/vbdev_error_rpc.o 00:03:11.960 CC module/bdev/null/bdev_null_rpc.o 00:03:11.960 CC module/bdev/gpt/gpt.o 00:03:11.960 CC module/bdev/passthru/vbdev_passthru.o 00:03:11.960 CC module/bdev/nvme/vbdev_opal.o 00:03:11.960 CC module/bdev/gpt/vbdev_gpt.o 00:03:11.961 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:11.961 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:11.961 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:11.961 CC module/bdev/aio/bdev_aio.o 00:03:11.961 CC module/bdev/aio/bdev_aio_rpc.o 00:03:11.961 CC module/bdev/split/vbdev_split.o 00:03:11.961 CC module/bdev/split/vbdev_split_rpc.o 00:03:11.961 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:11.961 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:11.961 CC module/bdev/iscsi/bdev_iscsi.o 00:03:11.961 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:11.961 CC module/bdev/raid/bdev_raid_rpc.o 00:03:11.961 CC module/bdev/raid/bdev_raid.o 00:03:11.961 CC module/bdev/raid/bdev_raid_sb.o 00:03:11.961 CC module/bdev/raid/raid0.o 00:03:11.961 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:11.961 CC module/bdev/raid/raid1.o 00:03:11.961 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:11.961 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:11.961 CC module/blobfs/bdev/blobfs_bdev.o 00:03:11.961 CC module/bdev/raid/concat.o 00:03:11.961 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:12.219 LIB libspdk_blobfs_bdev.a 00:03:12.219 LIB libspdk_bdev_split.a 00:03:12.219 LIB libspdk_bdev_null.a 00:03:12.219 LIB libspdk_bdev_error.a 00:03:12.219 LIB libspdk_bdev_gpt.a 00:03:12.219 LIB libspdk_bdev_ftl.a 00:03:12.219 LIB libspdk_bdev_passthru.a 00:03:12.219 LIB libspdk_bdev_aio.a 00:03:12.219 LIB libspdk_bdev_zone_block.a 00:03:12.219 LIB libspdk_bdev_iscsi.a 00:03:12.219 LIB libspdk_bdev_delay.a 00:03:12.219 LIB libspdk_bdev_malloc.a 00:03:12.479 LIB libspdk_bdev_lvol.a 00:03:12.479 LIB libspdk_bdev_virtio.a 00:03:12.479 LIB libspdk_bdev_raid.a 00:03:13.418 LIB libspdk_bdev_nvme.a 00:03:13.677 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:13.937 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:13.937 CC module/event/subsystems/scheduler/scheduler.o 00:03:13.937 CC module/event/subsystems/iobuf/iobuf.o 00:03:13.937 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:13.937 CC module/event/subsystems/vmd/vmd.o 00:03:13.937 CC module/event/subsystems/sock/sock.o 00:03:13.937 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:13.937 LIB libspdk_event_vhost_blk.a 00:03:13.937 LIB libspdk_event_sock.a 00:03:13.937 LIB libspdk_event_scheduler.a 00:03:13.937 LIB libspdk_event_iobuf.a 00:03:13.937 LIB libspdk_event_vfu_tgt.a 00:03:13.937 LIB libspdk_event_vmd.a 00:03:14.196 CC module/event/subsystems/accel/accel.o 00:03:14.456 LIB libspdk_event_accel.a 00:03:14.715 CC module/event/subsystems/bdev/bdev.o 00:03:14.715 LIB 
libspdk_event_bdev.a 00:03:15.285 CC module/event/subsystems/ublk/ublk.o 00:03:15.285 CC module/event/subsystems/scsi/scsi.o 00:03:15.285 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:15.285 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:15.285 CC module/event/subsystems/nbd/nbd.o 00:03:15.285 LIB libspdk_event_ublk.a 00:03:15.285 LIB libspdk_event_scsi.a 00:03:15.285 LIB libspdk_event_nbd.a 00:03:15.285 LIB libspdk_event_nvmf.a 00:03:15.544 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:15.544 CC module/event/subsystems/iscsi/iscsi.o 00:03:15.544 LIB libspdk_event_vhost_scsi.a 00:03:15.803 LIB libspdk_event_iscsi.a 00:03:16.066 CC app/spdk_nvme_perf/perf.o 00:03:16.066 CXX app/trace/trace.o 00:03:16.066 CC app/spdk_top/spdk_top.o 00:03:16.066 CC app/spdk_nvme_identify/identify.o 00:03:16.066 CC app/trace_record/trace_record.o 00:03:16.066 CC test/rpc_client/rpc_client_test.o 00:03:16.066 CC app/spdk_lspci/spdk_lspci.o 00:03:16.066 TEST_HEADER include/spdk/accel_module.h 00:03:16.066 TEST_HEADER include/spdk/assert.h 00:03:16.066 TEST_HEADER include/spdk/accel.h 00:03:16.066 TEST_HEADER include/spdk/barrier.h 00:03:16.066 TEST_HEADER include/spdk/bdev.h 00:03:16.066 TEST_HEADER include/spdk/base64.h 00:03:16.066 TEST_HEADER include/spdk/bdev_module.h 00:03:16.066 CC app/spdk_nvme_discover/discovery_aer.o 00:03:16.066 TEST_HEADER include/spdk/bdev_zone.h 00:03:16.066 TEST_HEADER include/spdk/bit_array.h 00:03:16.066 TEST_HEADER include/spdk/bit_pool.h 00:03:16.066 TEST_HEADER include/spdk/blob_bdev.h 00:03:16.066 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:16.066 TEST_HEADER include/spdk/blobfs.h 00:03:16.066 TEST_HEADER include/spdk/blob.h 00:03:16.066 TEST_HEADER include/spdk/conf.h 00:03:16.066 TEST_HEADER include/spdk/config.h 00:03:16.066 TEST_HEADER include/spdk/cpuset.h 00:03:16.066 TEST_HEADER include/spdk/crc16.h 00:03:16.066 TEST_HEADER include/spdk/crc64.h 00:03:16.066 TEST_HEADER include/spdk/dif.h 00:03:16.066 TEST_HEADER include/spdk/crc32.h 00:03:16.066 TEST_HEADER include/spdk/dma.h 00:03:16.066 TEST_HEADER include/spdk/endian.h 00:03:16.066 TEST_HEADER include/spdk/env_dpdk.h 00:03:16.066 TEST_HEADER include/spdk/env.h 00:03:16.066 TEST_HEADER include/spdk/event.h 00:03:16.066 TEST_HEADER include/spdk/fd_group.h 00:03:16.066 CC app/spdk_dd/spdk_dd.o 00:03:16.066 TEST_HEADER include/spdk/fd.h 00:03:16.066 TEST_HEADER include/spdk/ftl.h 00:03:16.066 TEST_HEADER include/spdk/file.h 00:03:16.066 TEST_HEADER include/spdk/gpt_spec.h 00:03:16.066 TEST_HEADER include/spdk/hexlify.h 00:03:16.066 CC app/nvmf_tgt/nvmf_main.o 00:03:16.066 TEST_HEADER include/spdk/histogram_data.h 00:03:16.066 TEST_HEADER include/spdk/idxd.h 00:03:16.066 TEST_HEADER include/spdk/idxd_spec.h 00:03:16.066 TEST_HEADER include/spdk/init.h 00:03:16.066 TEST_HEADER include/spdk/ioat.h 00:03:16.066 TEST_HEADER include/spdk/ioat_spec.h 00:03:16.066 TEST_HEADER include/spdk/iscsi_spec.h 00:03:16.066 TEST_HEADER include/spdk/json.h 00:03:16.066 TEST_HEADER include/spdk/jsonrpc.h 00:03:16.066 TEST_HEADER include/spdk/log.h 00:03:16.066 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:16.066 TEST_HEADER include/spdk/likely.h 00:03:16.066 TEST_HEADER include/spdk/memory.h 00:03:16.066 TEST_HEADER include/spdk/lvol.h 00:03:16.066 TEST_HEADER include/spdk/mmio.h 00:03:16.066 TEST_HEADER include/spdk/notify.h 00:03:16.066 TEST_HEADER include/spdk/nbd.h 00:03:16.066 TEST_HEADER include/spdk/nvme.h 00:03:16.066 TEST_HEADER include/spdk/nvme_intel.h 00:03:16.066 CC app/vhost/vhost.o 00:03:16.066 TEST_HEADER 
include/spdk/nvme_ocssd_spec.h 00:03:16.066 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:16.066 TEST_HEADER include/spdk/nvme_spec.h 00:03:16.066 TEST_HEADER include/spdk/nvme_zns.h 00:03:16.066 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:16.066 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:16.066 TEST_HEADER include/spdk/nvmf.h 00:03:16.066 TEST_HEADER include/spdk/nvmf_spec.h 00:03:16.066 TEST_HEADER include/spdk/nvmf_transport.h 00:03:16.066 TEST_HEADER include/spdk/opal.h 00:03:16.066 TEST_HEADER include/spdk/opal_spec.h 00:03:16.066 TEST_HEADER include/spdk/pci_ids.h 00:03:16.066 TEST_HEADER include/spdk/pipe.h 00:03:16.066 TEST_HEADER include/spdk/queue.h 00:03:16.066 CC app/iscsi_tgt/iscsi_tgt.o 00:03:16.066 TEST_HEADER include/spdk/reduce.h 00:03:16.066 TEST_HEADER include/spdk/scheduler.h 00:03:16.066 TEST_HEADER include/spdk/rpc.h 00:03:16.066 TEST_HEADER include/spdk/scsi.h 00:03:16.066 TEST_HEADER include/spdk/scsi_spec.h 00:03:16.066 TEST_HEADER include/spdk/sock.h 00:03:16.066 TEST_HEADER include/spdk/stdinc.h 00:03:16.066 TEST_HEADER include/spdk/string.h 00:03:16.066 TEST_HEADER include/spdk/thread.h 00:03:16.066 TEST_HEADER include/spdk/trace_parser.h 00:03:16.066 TEST_HEADER include/spdk/trace.h 00:03:16.066 TEST_HEADER include/spdk/tree.h 00:03:16.066 TEST_HEADER include/spdk/ublk.h 00:03:16.066 TEST_HEADER include/spdk/util.h 00:03:16.066 TEST_HEADER include/spdk/uuid.h 00:03:16.066 TEST_HEADER include/spdk/version.h 00:03:16.066 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:16.066 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:16.066 TEST_HEADER include/spdk/vhost.h 00:03:16.066 TEST_HEADER include/spdk/vmd.h 00:03:16.066 TEST_HEADER include/spdk/zipf.h 00:03:16.066 CXX test/cpp_headers/accel.o 00:03:16.066 TEST_HEADER include/spdk/xor.h 00:03:16.066 CXX test/cpp_headers/accel_module.o 00:03:16.066 CXX test/cpp_headers/barrier.o 00:03:16.066 CXX test/cpp_headers/assert.o 00:03:16.066 CXX test/cpp_headers/base64.o 00:03:16.066 CXX test/cpp_headers/bdev.o 00:03:16.066 CXX test/cpp_headers/bdev_zone.o 00:03:16.066 CXX test/cpp_headers/bdev_module.o 00:03:16.066 CXX test/cpp_headers/bit_array.o 00:03:16.066 CXX test/cpp_headers/blob_bdev.o 00:03:16.066 CXX test/cpp_headers/bit_pool.o 00:03:16.066 CXX test/cpp_headers/blobfs.o 00:03:16.066 CXX test/cpp_headers/blobfs_bdev.o 00:03:16.066 CC app/spdk_tgt/spdk_tgt.o 00:03:16.066 CXX test/cpp_headers/blob.o 00:03:16.066 CXX test/cpp_headers/conf.o 00:03:16.066 CXX test/cpp_headers/config.o 00:03:16.066 CXX test/cpp_headers/cpuset.o 00:03:16.066 CXX test/cpp_headers/crc16.o 00:03:16.066 CXX test/cpp_headers/crc32.o 00:03:16.066 CXX test/cpp_headers/crc64.o 00:03:16.066 CXX test/cpp_headers/dif.o 00:03:16.066 CXX test/cpp_headers/endian.o 00:03:16.066 CXX test/cpp_headers/dma.o 00:03:16.066 CXX test/cpp_headers/env_dpdk.o 00:03:16.066 CXX test/cpp_headers/env.o 00:03:16.066 CXX test/cpp_headers/event.o 00:03:16.066 CXX test/cpp_headers/fd_group.o 00:03:16.066 CXX test/cpp_headers/fd.o 00:03:16.066 CXX test/cpp_headers/file.o 00:03:16.066 CXX test/cpp_headers/gpt_spec.o 00:03:16.066 CXX test/cpp_headers/ftl.o 00:03:16.066 CC examples/sock/hello_world/hello_sock.o 00:03:16.066 CXX test/cpp_headers/hexlify.o 00:03:16.066 CXX test/cpp_headers/histogram_data.o 00:03:16.066 CXX test/cpp_headers/idxd.o 00:03:16.066 CXX test/cpp_headers/idxd_spec.o 00:03:16.066 CXX test/cpp_headers/init.o 00:03:16.066 CC examples/idxd/perf/perf.o 00:03:16.066 CC test/app/stub/stub.o 00:03:16.066 CC test/app/histogram_perf/histogram_perf.o 00:03:16.066 CC 
test/thread/poller_perf/poller_perf.o 00:03:16.066 CC examples/accel/perf/accel_perf.o 00:03:16.066 CC test/event/event_perf/event_perf.o 00:03:16.066 CC test/app/jsoncat/jsoncat.o 00:03:16.066 CC examples/ioat/verify/verify.o 00:03:16.066 CC app/fio/nvme/fio_plugin.o 00:03:16.066 CC test/nvme/reset/reset.o 00:03:16.067 CC test/thread/lock/spdk_lock.o 00:03:16.067 CC test/nvme/overhead/overhead.o 00:03:16.067 CC test/nvme/aer/aer.o 00:03:16.067 CC test/event/reactor/reactor.o 00:03:16.067 CC test/env/memory/memory_ut.o 00:03:16.067 CC test/nvme/err_injection/err_injection.o 00:03:16.067 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:16.067 CC examples/nvme/hello_world/hello_world.o 00:03:16.067 CC examples/nvme/arbitration/arbitration.o 00:03:16.067 CC examples/ioat/perf/perf.o 00:03:16.067 CC test/nvme/reserve/reserve.o 00:03:16.067 CC examples/vmd/lsvmd/lsvmd.o 00:03:16.067 CC test/nvme/fdp/fdp.o 00:03:16.067 CC test/nvme/sgl/sgl.o 00:03:16.067 CC test/nvme/fused_ordering/fused_ordering.o 00:03:16.067 CC test/nvme/simple_copy/simple_copy.o 00:03:16.067 CC test/nvme/compliance/nvme_compliance.o 00:03:16.067 CC test/env/pci/pci_ut.o 00:03:16.067 CC examples/nvme/reconnect/reconnect.o 00:03:16.067 CC test/nvme/startup/startup.o 00:03:16.067 CC examples/nvme/abort/abort.o 00:03:16.335 CC test/nvme/e2edp/nvme_dp.o 00:03:16.335 CC test/env/vtophys/vtophys.o 00:03:16.335 LINK spdk_lspci 00:03:16.335 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:16.335 CC examples/nvme/hotplug/hotplug.o 00:03:16.335 CC test/event/reactor_perf/reactor_perf.o 00:03:16.335 CC examples/util/zipf/zipf.o 00:03:16.335 CC examples/vmd/led/led.o 00:03:16.335 CC test/nvme/connect_stress/connect_stress.o 00:03:16.335 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:16.335 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:16.335 CC test/nvme/cuse/cuse.o 00:03:16.335 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:16.335 CC test/nvme/boot_partition/boot_partition.o 00:03:16.335 CC test/accel/dif/dif.o 00:03:16.335 CC test/blobfs/mkfs/mkfs.o 00:03:16.335 CC examples/blob/cli/blobcli.o 00:03:16.335 CC app/fio/bdev/fio_plugin.o 00:03:16.335 CC test/event/app_repeat/app_repeat.o 00:03:16.335 CC examples/blob/hello_world/hello_blob.o 00:03:16.335 CC test/app/bdev_svc/bdev_svc.o 00:03:16.335 CC examples/bdev/bdevperf/bdevperf.o 00:03:16.335 CC examples/bdev/hello_world/hello_bdev.o 00:03:16.335 CC test/bdev/bdevio/bdevio.o 00:03:16.335 CC examples/nvmf/nvmf/nvmf.o 00:03:16.335 CC test/dma/test_dma/test_dma.o 00:03:16.335 LINK rpc_client_test 00:03:16.335 CC examples/thread/thread/thread_ex.o 00:03:16.335 CC test/event/scheduler/scheduler.o 00:03:16.335 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:16.335 CC test/lvol/esnap/esnap.o 00:03:16.335 CC test/env/mem_callbacks/mem_callbacks.o 00:03:16.335 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:16.335 LINK spdk_nvme_discover 00:03:16.335 LINK nvmf_tgt 00:03:16.335 LINK interrupt_tgt 00:03:16.335 LINK spdk_trace_record 00:03:16.335 CXX test/cpp_headers/ioat.o 00:03:16.335 LINK vhost 00:03:16.335 CXX test/cpp_headers/ioat_spec.o 00:03:16.335 LINK jsoncat 00:03:16.335 CXX test/cpp_headers/iscsi_spec.o 00:03:16.335 LINK event_perf 00:03:16.335 CXX test/cpp_headers/json.o 00:03:16.335 LINK histogram_perf 00:03:16.335 CXX test/cpp_headers/jsonrpc.o 00:03:16.335 CXX test/cpp_headers/likely.o 00:03:16.335 CXX test/cpp_headers/log.o 00:03:16.335 LINK poller_perf 00:03:16.335 CXX test/cpp_headers/lvol.o 00:03:16.335 CXX test/cpp_headers/memory.o 00:03:16.335 CXX 
test/cpp_headers/mmio.o 00:03:16.335 CXX test/cpp_headers/nbd.o 00:03:16.335 CXX test/cpp_headers/notify.o 00:03:16.335 CXX test/cpp_headers/nvme.o 00:03:16.335 CXX test/cpp_headers/nvme_intel.o 00:03:16.335 CXX test/cpp_headers/nvme_ocssd.o 00:03:16.335 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:16.335 CXX test/cpp_headers/nvme_spec.o 00:03:16.335 CXX test/cpp_headers/nvme_zns.o 00:03:16.335 LINK lsvmd 00:03:16.335 CXX test/cpp_headers/nvmf_cmd.o 00:03:16.335 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:16.335 CXX test/cpp_headers/nvmf.o 00:03:16.335 CXX test/cpp_headers/nvmf_spec.o 00:03:16.335 CXX test/cpp_headers/nvmf_transport.o 00:03:16.335 CXX test/cpp_headers/opal.o 00:03:16.335 LINK reactor_perf 00:03:16.335 LINK iscsi_tgt 00:03:16.335 CXX test/cpp_headers/opal_spec.o 00:03:16.335 LINK reactor 00:03:16.335 CXX test/cpp_headers/pci_ids.o 00:03:16.335 CXX test/cpp_headers/pipe.o 00:03:16.335 LINK vtophys 00:03:16.335 CXX test/cpp_headers/queue.o 00:03:16.335 CXX test/cpp_headers/reduce.o 00:03:16.335 LINK led 00:03:16.335 CXX test/cpp_headers/rpc.o 00:03:16.335 LINK env_dpdk_post_init 00:03:16.335 LINK zipf 00:03:16.335 LINK stub 00:03:16.335 CXX test/cpp_headers/scheduler.o 00:03:16.335 CXX test/cpp_headers/scsi.o 00:03:16.335 LINK err_injection 00:03:16.335 CXX test/cpp_headers/scsi_spec.o 00:03:16.335 LINK pmr_persistence 00:03:16.335 LINK spdk_tgt 00:03:16.335 LINK startup 00:03:16.335 LINK boot_partition 00:03:16.335 LINK app_repeat 00:03:16.335 LINK connect_stress 00:03:16.335 LINK reserve 00:03:16.335 CXX test/cpp_headers/sock.o 00:03:16.335 LINK fused_ordering 00:03:16.335 LINK doorbell_aers 00:03:16.335 LINK hello_sock 00:03:16.335 LINK verify 00:03:16.595 LINK cmb_copy 00:03:16.595 LINK hello_world 00:03:16.595 LINK ioat_perf 00:03:16.595 LINK bdev_svc 00:03:16.595 LINK mkfs 00:03:16.595 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:16.595 LINK hotplug 00:03:16.595 LINK simple_copy 00:03:16.595 LINK reset 00:03:16.595 LINK hello_bdev 00:03:16.595 LINK fdp 00:03:16.595 LINK hello_blob 00:03:16.595 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:16.595 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:16.595 LINK nvme_dp 00:03:16.595 LINK sgl 00:03:16.595 LINK aer 00:03:16.595 LINK spdk_trace 00:03:16.595 LINK overhead 00:03:16.595 LINK scheduler 00:03:16.595 LINK thread 00:03:16.595 CXX test/cpp_headers/stdinc.o 00:03:16.595 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:16.595 LINK mem_callbacks 00:03:16.595 CXX test/cpp_headers/string.o 00:03:16.595 CXX test/cpp_headers/thread.o 00:03:16.595 LINK idxd_perf 00:03:16.595 CXX test/cpp_headers/trace.o 00:03:16.595 CXX test/cpp_headers/trace_parser.o 00:03:16.595 CXX test/cpp_headers/tree.o 00:03:16.595 CXX test/cpp_headers/ublk.o 00:03:16.595 CXX test/cpp_headers/util.o 00:03:16.595 CXX test/cpp_headers/uuid.o 00:03:16.595 CXX test/cpp_headers/version.o 00:03:16.595 CXX test/cpp_headers/vfio_user_pci.o 00:03:16.595 CXX test/cpp_headers/vfio_user_spec.o 00:03:16.595 CXX test/cpp_headers/vhost.o 00:03:16.595 CXX test/cpp_headers/vmd.o 00:03:16.595 CXX test/cpp_headers/xor.o 00:03:16.595 CXX test/cpp_headers/zipf.o 00:03:16.596 LINK nvmf 00:03:16.596 LINK spdk_dd 00:03:16.596 LINK reconnect 00:03:16.596 LINK dif 00:03:16.596 LINK arbitration 00:03:16.596 LINK abort 00:03:16.855 LINK test_dma 00:03:16.855 LINK bdevio 00:03:16.855 LINK accel_perf 00:03:16.855 LINK nvme_manage 00:03:16.855 LINK pci_ut 00:03:16.855 LINK nvme_compliance 00:03:16.855 LINK spdk_nvme 00:03:16.855 LINK nvme_fuzz 00:03:16.855 LINK 
memory_ut 00:03:16.855 LINK blobcli 00:03:16.855 LINK llvm_vfio_fuzz 00:03:16.855 LINK spdk_bdev 00:03:16.855 LINK spdk_nvme_identify 00:03:17.113 LINK bdevperf 00:03:17.113 LINK spdk_nvme_perf 00:03:17.113 LINK spdk_top 00:03:17.113 LINK vhost_fuzz 00:03:17.113 LINK llvm_nvme_fuzz 00:03:17.371 LINK cuse 00:03:17.630 LINK spdk_lock 00:03:17.888 LINK iscsi_fuzz 00:03:19.795 LINK esnap 00:03:20.055 00:03:20.055 real 0m23.839s 00:03:20.055 user 4m15.973s 00:03:20.055 sys 2m5.316s 00:03:20.055 17:47:12 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:20.055 17:47:12 -- common/autotest_common.sh@10 -- $ set +x 00:03:20.055 ************************************ 00:03:20.055 END TEST make 00:03:20.055 ************************************ 00:03:20.316 17:47:12 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:20.316 17:47:13 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:20.316 17:47:13 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:20.316 17:47:13 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:20.316 17:47:13 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:20.316 17:47:13 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:20.316 17:47:13 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:20.316 17:47:13 -- scripts/common.sh@335 -- # IFS=.-: 00:03:20.316 17:47:13 -- scripts/common.sh@335 -- # read -ra ver1 00:03:20.316 17:47:13 -- scripts/common.sh@336 -- # IFS=.-: 00:03:20.316 17:47:13 -- scripts/common.sh@336 -- # read -ra ver2 00:03:20.316 17:47:13 -- scripts/common.sh@337 -- # local 'op=<' 00:03:20.316 17:47:13 -- scripts/common.sh@339 -- # ver1_l=2 00:03:20.316 17:47:13 -- scripts/common.sh@340 -- # ver2_l=1 00:03:20.316 17:47:13 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:20.316 17:47:13 -- scripts/common.sh@343 -- # case "$op" in 00:03:20.316 17:47:13 -- scripts/common.sh@344 -- # : 1 00:03:20.316 17:47:13 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:20.316 17:47:13 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:20.316 17:47:13 -- scripts/common.sh@364 -- # decimal 1 00:03:20.316 17:47:13 -- scripts/common.sh@352 -- # local d=1 00:03:20.316 17:47:13 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:20.316 17:47:13 -- scripts/common.sh@354 -- # echo 1 00:03:20.316 17:47:13 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:20.316 17:47:13 -- scripts/common.sh@365 -- # decimal 2 00:03:20.316 17:47:13 -- scripts/common.sh@352 -- # local d=2 00:03:20.316 17:47:13 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:20.316 17:47:13 -- scripts/common.sh@354 -- # echo 2 00:03:20.316 17:47:13 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:20.316 17:47:13 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:20.316 17:47:13 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:20.316 17:47:13 -- scripts/common.sh@367 -- # return 0 00:03:20.316 17:47:13 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:20.316 17:47:13 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:20.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:20.316 --rc genhtml_branch_coverage=1 00:03:20.316 --rc genhtml_function_coverage=1 00:03:20.316 --rc genhtml_legend=1 00:03:20.316 --rc geninfo_all_blocks=1 00:03:20.316 --rc geninfo_unexecuted_blocks=1 00:03:20.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:20.316 ' 00:03:20.316 17:47:13 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:20.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:20.316 --rc genhtml_branch_coverage=1 00:03:20.316 --rc genhtml_function_coverage=1 00:03:20.316 --rc genhtml_legend=1 00:03:20.316 --rc geninfo_all_blocks=1 00:03:20.316 --rc geninfo_unexecuted_blocks=1 00:03:20.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:20.316 ' 00:03:20.316 17:47:13 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:20.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:20.316 --rc genhtml_branch_coverage=1 00:03:20.316 --rc genhtml_function_coverage=1 00:03:20.316 --rc genhtml_legend=1 00:03:20.316 --rc geninfo_all_blocks=1 00:03:20.316 --rc geninfo_unexecuted_blocks=1 00:03:20.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:20.316 ' 00:03:20.316 17:47:13 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:20.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:20.316 --rc genhtml_branch_coverage=1 00:03:20.316 --rc genhtml_function_coverage=1 00:03:20.316 --rc genhtml_legend=1 00:03:20.316 --rc geninfo_all_blocks=1 00:03:20.316 --rc geninfo_unexecuted_blocks=1 00:03:20.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:20.316 ' 00:03:20.316 17:47:13 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:20.316 17:47:13 -- nvmf/common.sh@7 -- # uname -s 00:03:20.316 17:47:13 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:20.316 17:47:13 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:20.316 17:47:13 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:20.316 17:47:13 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:20.316 17:47:13 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:20.316 17:47:13 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:20.316 17:47:13 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:20.316 17:47:13 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:20.316 17:47:13 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:20.316 17:47:13 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:20.316 17:47:13 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:20.316 17:47:13 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:20.316 17:47:13 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:20.316 17:47:13 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:20.316 17:47:13 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:20.316 17:47:13 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:20.316 17:47:13 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:20.316 17:47:13 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:20.316 17:47:13 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:20.316 17:47:13 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:20.316 17:47:13 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:20.316 17:47:13 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:20.316 17:47:13 -- paths/export.sh@5 -- # export PATH 00:03:20.317 17:47:13 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:20.317 17:47:13 -- nvmf/common.sh@46 -- # : 0 00:03:20.317 17:47:13 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:20.317 17:47:13 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:20.317 17:47:13 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:20.317 17:47:13 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:20.317 17:47:13 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:20.317 17:47:13 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:20.317 17:47:13 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:20.317 17:47:13 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:20.317 17:47:13 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:20.317 17:47:13 -- spdk/autotest.sh@32 -- # uname -s 00:03:20.317 17:47:13 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:20.317 17:47:13 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:20.317 17:47:13 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:20.317 17:47:13 -- 
spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:20.317 17:47:13 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:20.317 17:47:13 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:20.317 17:47:13 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:20.317 17:47:13 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:20.317 17:47:13 -- spdk/autotest.sh@48 -- # udevadm_pid=559608 00:03:20.317 17:47:13 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:20.317 17:47:13 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:20.317 17:47:13 -- spdk/autotest.sh@54 -- # echo 559610 00:03:20.317 17:47:13 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:20.317 17:47:13 -- spdk/autotest.sh@56 -- # echo 559611 00:03:20.317 17:47:13 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:20.317 17:47:13 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:03:20.317 17:47:13 -- spdk/autotest.sh@60 -- # echo 559612 00:03:20.317 17:47:13 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:20.317 17:47:13 -- spdk/autotest.sh@62 -- # echo 559613 00:03:20.317 17:47:13 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:20.317 17:47:13 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:20.317 17:47:13 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:20.317 17:47:13 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:20.317 17:47:13 -- common/autotest_common.sh@10 -- # set +x 00:03:20.317 17:47:13 -- spdk/autotest.sh@70 -- # create_test_list 00:03:20.317 17:47:13 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:20.317 17:47:13 -- common/autotest_common.sh@10 -- # set +x 00:03:20.317 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:20.577 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:20.577 17:47:13 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:20.577 17:47:13 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:20.577 17:47:13 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:20.577 17:47:13 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:20.577 17:47:13 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:20.577 17:47:13 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:20.577 17:47:13 -- common/autotest_common.sh@1450 -- # uname 00:03:20.577 17:47:13 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:20.577 17:47:13 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 
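A note on the core-pattern swap traced above: autotest saves the old kernel core pattern and installs scripts/core-collector.sh as a pipe handler, so any SPDK process crash during the run lands in the output/coredumps directory. Bash xtrace does not print redirections, which is why the echo appears without a target; the conventional target is /proc/sys/kernel/core_pattern. A minimal sketch, assuming root on Linux (the restore-on-exit trap is a safety addition not shown in the trace):

  # pipe kernel coredumps through a collector script; %P = pid, %s = signal, %t = time (see core(5))
  old_pattern=$(cat /proc/sys/kernel/core_pattern)
  echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' \
    > /proc/sys/kernel/core_pattern
  trap 'echo "$old_pattern" > /proc/sys/kernel/core_pattern' EXIT   # restore the saved pattern on exit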
00:03:20.577 17:47:13 -- common/autotest_common.sh@1470 -- # uname 00:03:20.577 17:47:13 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:20.577 17:47:13 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:20.577 17:47:13 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:20.577 lcov: LCOV version 1.15 00:03:20.577 17:47:13 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:22.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:22.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:22.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:34.705 17:47:27 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:34.705 17:47:27 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:34.705 17:47:27 -- common/autotest_common.sh@10 -- # set +x 00:03:34.705 17:47:27 -- spdk/autotest.sh@89 -- # rm -f 00:03:34.705 17:47:27 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:38.900 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:38.900 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:38.900 17:47:31 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:38.900 17:47:31 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:38.900 17:47:31 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:38.900 17:47:31 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:38.900 17:47:31 -- 
common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:38.900 17:47:31 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:38.900 17:47:31 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:38.900 17:47:31 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:38.900 17:47:31 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:38.900 17:47:31 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:38.900 17:47:31 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:38.900 17:47:31 -- spdk/autotest.sh@108 -- # grep -v p 00:03:38.900 17:47:31 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:38.900 17:47:31 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:38.900 17:47:31 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:38.900 17:47:31 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:38.900 17:47:31 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:38.900 No valid GPT data, bailing 00:03:38.900 17:47:31 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:38.900 17:47:31 -- scripts/common.sh@393 -- # pt= 00:03:38.900 17:47:31 -- scripts/common.sh@394 -- # return 1 00:03:38.901 17:47:31 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:38.901 1+0 records in 00:03:38.901 1+0 records out 00:03:38.901 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00522292 s, 201 MB/s 00:03:38.901 17:47:31 -- spdk/autotest.sh@116 -- # sync 00:03:38.901 17:47:31 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:38.901 17:47:31 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:38.901 17:47:31 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:47.030 17:47:38 -- spdk/autotest.sh@122 -- # uname -s 00:03:47.030 17:47:38 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:47.030 17:47:38 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:47.030 17:47:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:47.030 17:47:38 -- common/autotest_common.sh@10 -- # set +x 00:03:47.030 ************************************ 00:03:47.030 START TEST setup.sh 00:03:47.030 ************************************ 00:03:47.030 17:47:38 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:47.030 * Looking for test storage... 
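The wipe sequence traced above is worth spelling out: each non-zoned NVMe namespace is first probed with spdk-gpt.py, and since no valid GPT was found, blkid is asked for a partition-table type; an empty answer means the disk is treated as unused and its first MiB is zeroed so later tests start from a clean device. A condensed sketch of that logic, assuming /dev/nvme0n1 and root (destructive, CI-only):

  # decide whether the disk is in use, then clear its label area
  pt=$(blkid -s PTTYPE -o value /dev/nvme0n1)        # empty when no partition table is present
  if [[ -z $pt ]]; then
    dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1    # wipe the first MiB (GPT/MBR headers)
  fi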
00:03:47.030 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:47.030 17:47:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:47.030 17:47:38 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:47.030 17:47:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:47.030 17:47:38 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:47.030 17:47:38 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:47.030 17:47:38 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:47.030 17:47:38 -- scripts/common.sh@335 -- # IFS=.-: 00:03:47.030 17:47:38 -- scripts/common.sh@335 -- # read -ra ver1 00:03:47.030 17:47:38 -- scripts/common.sh@336 -- # IFS=.-: 00:03:47.030 17:47:38 -- scripts/common.sh@336 -- # read -ra ver2 00:03:47.030 17:47:38 -- scripts/common.sh@337 -- # local 'op=<' 00:03:47.030 17:47:38 -- scripts/common.sh@339 -- # ver1_l=2 00:03:47.030 17:47:38 -- scripts/common.sh@340 -- # ver2_l=1 00:03:47.030 17:47:38 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:47.030 17:47:38 -- scripts/common.sh@343 -- # case "$op" in 00:03:47.030 17:47:38 -- scripts/common.sh@344 -- # : 1 00:03:47.030 17:47:38 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:47.030 17:47:38 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:47.030 17:47:38 -- scripts/common.sh@364 -- # decimal 1 00:03:47.030 17:47:38 -- scripts/common.sh@352 -- # local d=1 00:03:47.030 17:47:38 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:47.030 17:47:38 -- scripts/common.sh@354 -- # echo 1 00:03:47.030 17:47:38 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:47.030 17:47:38 -- scripts/common.sh@365 -- # decimal 2 00:03:47.030 17:47:38 -- scripts/common.sh@352 -- # local d=2 00:03:47.030 17:47:38 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:47.030 17:47:38 -- scripts/common.sh@354 -- # echo 2 00:03:47.030 17:47:38 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:47.030 17:47:38 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:47.030 17:47:38 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:47.030 17:47:38 -- scripts/common.sh@367 -- # return 0 00:03:47.030 17:47:38 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:47.030 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.030 --rc genhtml_branch_coverage=1 00:03:47.030 --rc genhtml_function_coverage=1 00:03:47.030 --rc genhtml_legend=1 00:03:47.030 --rc geninfo_all_blocks=1 00:03:47.030 --rc geninfo_unexecuted_blocks=1 00:03:47.030 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.030 ' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:47.030 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.030 --rc genhtml_branch_coverage=1 00:03:47.030 --rc genhtml_function_coverage=1 00:03:47.030 --rc genhtml_legend=1 00:03:47.030 --rc geninfo_all_blocks=1 00:03:47.030 --rc geninfo_unexecuted_blocks=1 00:03:47.030 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.030 ' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:47.030 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.030 --rc genhtml_branch_coverage=1 
00:03:47.030 --rc genhtml_function_coverage=1 00:03:47.030 --rc genhtml_legend=1 00:03:47.030 --rc geninfo_all_blocks=1 00:03:47.030 --rc geninfo_unexecuted_blocks=1 00:03:47.030 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.030 ' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:47.030 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.030 --rc genhtml_branch_coverage=1 00:03:47.030 --rc genhtml_function_coverage=1 00:03:47.030 --rc genhtml_legend=1 00:03:47.030 --rc geninfo_all_blocks=1 00:03:47.030 --rc geninfo_unexecuted_blocks=1 00:03:47.030 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.030 ' 00:03:47.030 17:47:38 -- setup/test-setup.sh@10 -- # uname -s 00:03:47.030 17:47:38 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:47.030 17:47:38 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:47.030 17:47:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:47.030 17:47:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:47.030 17:47:38 -- common/autotest_common.sh@10 -- # set +x 00:03:47.030 ************************************ 00:03:47.030 START TEST acl 00:03:47.030 ************************************ 00:03:47.030 17:47:38 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:47.031 * Looking for test storage... 00:03:47.031 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:47.031 17:47:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:47.031 17:47:38 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:47.031 17:47:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:47.031 17:47:39 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:47.031 17:47:39 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:47.031 17:47:39 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:47.031 17:47:39 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:47.031 17:47:39 -- scripts/common.sh@335 -- # IFS=.-: 00:03:47.031 17:47:39 -- scripts/common.sh@335 -- # read -ra ver1 00:03:47.031 17:47:39 -- scripts/common.sh@336 -- # IFS=.-: 00:03:47.031 17:47:39 -- scripts/common.sh@336 -- # read -ra ver2 00:03:47.031 17:47:39 -- scripts/common.sh@337 -- # local 'op=<' 00:03:47.031 17:47:39 -- scripts/common.sh@339 -- # ver1_l=2 00:03:47.031 17:47:39 -- scripts/common.sh@340 -- # ver2_l=1 00:03:47.031 17:47:39 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:47.031 17:47:39 -- scripts/common.sh@343 -- # case "$op" in 00:03:47.031 17:47:39 -- scripts/common.sh@344 -- # : 1 00:03:47.031 17:47:39 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:47.031 17:47:39 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:47.031 17:47:39 -- scripts/common.sh@364 -- # decimal 1 00:03:47.031 17:47:39 -- scripts/common.sh@352 -- # local d=1 00:03:47.031 17:47:39 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:47.031 17:47:39 -- scripts/common.sh@354 -- # echo 1 00:03:47.031 17:47:39 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:47.031 17:47:39 -- scripts/common.sh@365 -- # decimal 2 00:03:47.031 17:47:39 -- scripts/common.sh@352 -- # local d=2 00:03:47.031 17:47:39 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:47.031 17:47:39 -- scripts/common.sh@354 -- # echo 2 00:03:47.031 17:47:39 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:47.031 17:47:39 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:47.031 17:47:39 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:47.031 17:47:39 -- scripts/common.sh@367 -- # return 0 00:03:47.031 17:47:39 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:47.031 17:47:39 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:47.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.031 --rc genhtml_branch_coverage=1 00:03:47.031 --rc genhtml_function_coverage=1 00:03:47.031 --rc genhtml_legend=1 00:03:47.031 --rc geninfo_all_blocks=1 00:03:47.031 --rc geninfo_unexecuted_blocks=1 00:03:47.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.031 ' 00:03:47.031 17:47:39 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:47.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.031 --rc genhtml_branch_coverage=1 00:03:47.031 --rc genhtml_function_coverage=1 00:03:47.031 --rc genhtml_legend=1 00:03:47.031 --rc geninfo_all_blocks=1 00:03:47.031 --rc geninfo_unexecuted_blocks=1 00:03:47.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.031 ' 00:03:47.031 17:47:39 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:47.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.031 --rc genhtml_branch_coverage=1 00:03:47.031 --rc genhtml_function_coverage=1 00:03:47.031 --rc genhtml_legend=1 00:03:47.031 --rc geninfo_all_blocks=1 00:03:47.031 --rc geninfo_unexecuted_blocks=1 00:03:47.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.031 ' 00:03:47.031 17:47:39 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:47.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.031 --rc genhtml_branch_coverage=1 00:03:47.031 --rc genhtml_function_coverage=1 00:03:47.031 --rc genhtml_legend=1 00:03:47.031 --rc geninfo_all_blocks=1 00:03:47.031 --rc geninfo_unexecuted_blocks=1 00:03:47.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:47.031 ' 00:03:47.031 17:47:39 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:47.031 17:47:39 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:47.031 17:47:39 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:47.031 17:47:39 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:47.031 17:47:39 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:47.031 17:47:39 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:47.031 17:47:39 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:47.031 17:47:39 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:03:47.031 17:47:39 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:47.031 17:47:39 -- setup/acl.sh@12 -- # devs=() 00:03:47.031 17:47:39 -- setup/acl.sh@12 -- # declare -a devs 00:03:47.031 17:47:39 -- setup/acl.sh@13 -- # drivers=() 00:03:47.031 17:47:39 -- setup/acl.sh@13 -- # declare -A drivers 00:03:47.031 17:47:39 -- setup/acl.sh@51 -- # setup reset 00:03:47.031 17:47:39 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:47.031 17:47:39 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:50.325 17:47:42 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:50.325 17:47:42 -- setup/acl.sh@16 -- # local dev driver 00:03:50.325 17:47:42 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:50.325 17:47:42 -- setup/acl.sh@15 -- # setup output status 00:03:50.325 17:47:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.325 17:47:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:53.789 Hugepages 00:03:53.789 node hugesize free / total 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 00:03:53.789 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.789 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.789 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.789 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 
17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # continue 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:53.790 17:47:46 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:53.790 17:47:46 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:53.790 17:47:46 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:53.790 17:47:46 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.790 17:47:46 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:53.790 17:47:46 -- setup/acl.sh@54 -- # run_test denied denied 00:03:53.790 17:47:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:53.790 17:47:46 -- common/autotest_common.sh@1093 -- # 
00:03:53.790 17:47:46 -- setup/acl.sh@54 -- # run_test denied denied
00:03:53.790 17:47:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:53.790 17:47:46 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:53.790 17:47:46 -- common/autotest_common.sh@10 -- # set +x
00:03:53.790 ************************************
00:03:53.790 START TEST denied
00:03:53.790 ************************************
00:03:53.790 17:47:46 -- common/autotest_common.sh@1114 -- # denied
00:03:53.790 17:47:46 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0'
00:03:53.790 17:47:46 -- setup/acl.sh@38 -- # setup output config
00:03:53.790 17:47:46 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0'
00:03:53.790 17:47:46 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:53.790 17:47:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:58.146 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0
00:03:58.146 17:47:50 -- setup/acl.sh@40 -- # verify 0000:d8:00.0
00:03:58.146 17:47:50 -- setup/acl.sh@28 -- # local dev driver
00:03:58.146 17:47:50 -- setup/acl.sh@30 -- # for dev in "$@"
00:03:58.146 17:47:50 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]]
00:03:58.146 17:47:50 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver
00:03:58.146 17:47:50 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:03:58.146 17:47:50 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:03:58.146 17:47:50 -- setup/acl.sh@41 -- # setup reset
00:03:58.146 17:47:50 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:58.146 17:47:50 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:02.343
00:04:02.343 real 0m8.594s
00:04:02.343 user 0m2.825s
00:04:02.343 sys 0m5.086s
00:04:02.343 17:47:55 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:02.343 17:47:55 -- common/autotest_common.sh@10 -- # set +x
00:04:02.343 ************************************
00:04:02.343 END TEST denied
00:04:02.343 ************************************
00:04:02.343 17:47:55 -- setup/acl.sh@55 -- # run_test allowed allowed
00:04:02.343 17:47:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:02.343 17:47:55 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:02.343 17:47:55 -- common/autotest_common.sh@10 -- # set +x
00:04:02.343 ************************************
00:04:02.343 START TEST allowed
00:04:02.343 ************************************
00:04:02.343 17:47:55 -- common/autotest_common.sh@1114 -- # allowed
00:04:02.343 17:47:55 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0
00:04:02.343 17:47:55 -- setup/acl.sh@45 -- # setup output config
00:04:02.343 17:47:55 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*'
00:04:02.343 17:47:55 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:02.343 17:47:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:07.646 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:07.646 17:48:00 -- setup/acl.sh@47 -- # verify
00:04:07.646 17:48:00 -- setup/acl.sh@28 -- # local dev driver
00:04:07.646 17:48:00 -- setup/acl.sh@48 -- # setup reset
00:04:07.646 17:48:00 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:07.646 17:48:00 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:11.844
00:04:11.844 real 0m9.247s
00:04:11.844 user 0m2.665s
00:04:11.844 sys 0m5.151s
00:04:11.844 17:48:04 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:11.844 17:48:04 -- common/autotest_common.sh@10 -- # set +x
00:04:11.844 ************************************
00:04:11.844 END TEST allowed
00:04:11.844 ************************************
00:04:11.844
00:04:11.844 real 0m25.557s
00:04:11.844 user 0m8.272s
00:04:11.844 sys 0m15.485s
00:04:11.844 17:48:04 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:11.844 17:48:04 -- common/autotest_common.sh@10 -- # set +x
00:04:11.844 ************************************
00:04:11.844 END TEST acl
00:04:11.844 ************************************
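Both ACL tests above hinge on the same sysfs check: "denied" exports PCI_BLOCKED so setup.sh must skip the controller (it stays on nvme), while "allowed" sets PCI_ALLOWED so only that controller may be rebound. The verify trace resolves the bound driver through the device's driver symlink. A hedged sketch of that core check, not a verbatim copy of acl.sh:

# Sketch of the driver check visible in the verify trace above.
verify_driver_sketch() {
    local dev=$1 want=$2 driver
    [[ -e /sys/bus/pci/devices/$dev ]] || return 1
    # The driver symlink resolves to /sys/bus/pci/drivers/<name>; its
    # basename names the kernel driver that currently owns the device.
    driver=$(readlink -f "/sys/bus/pci/devices/$dev/driver")
    [[ ${driver##*/} == "$want" ]]
}
# Mirroring the log: after config ran with PCI_BLOCKED=' 0000:d8:00.0',
# the controller must still belong to nvme:
#   verify_driver_sketch 0000:d8:00.0 nvme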
00:04:11.844 17:48:04 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:04:11.844 17:48:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:11.844 17:48:04 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:11.844 17:48:04 -- common/autotest_common.sh@10 -- # set +x
00:04:11.844 ************************************
00:04:11.844 START TEST hugepages
00:04:11.844 ************************************
00:04:11.844 17:48:04 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:04:11.844 * Looking for test storage...
00:04:11.844 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:11.844 17:48:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:11.844 17:48:04 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:11.844 17:48:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:11.844 17:48:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:11.844 17:48:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:11.844 17:48:04 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:11.844 17:48:04 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:11.844 17:48:04 -- scripts/common.sh@335 -- # IFS=.-:
00:04:11.844 17:48:04 -- scripts/common.sh@335 -- # read -ra ver1
00:04:11.844 17:48:04 -- scripts/common.sh@336 -- # IFS=.-:
00:04:11.844 17:48:04 -- scripts/common.sh@336 -- # read -ra ver2
00:04:11.844 17:48:04 -- scripts/common.sh@337 -- # local 'op=<'
00:04:11.844 17:48:04 -- scripts/common.sh@339 -- # ver1_l=2
00:04:11.844 17:48:04 -- scripts/common.sh@340 -- # ver2_l=1
00:04:11.844 17:48:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:11.844 17:48:04 -- scripts/common.sh@343 -- # case "$op" in
00:04:11.844 17:48:04 -- scripts/common.sh@344 -- # : 1
00:04:11.844 17:48:04 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:11.844 17:48:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:11.844 17:48:04 -- scripts/common.sh@364 -- # decimal 1
00:04:11.844 17:48:04 -- scripts/common.sh@352 -- # local d=1
00:04:11.844 17:48:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:11.844 17:48:04 -- scripts/common.sh@354 -- # echo 1
00:04:11.844 17:48:04 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:11.844 17:48:04 -- scripts/common.sh@365 -- # decimal 2
00:04:11.844 17:48:04 -- scripts/common.sh@352 -- # local d=2
00:04:11.844 17:48:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:11.844 17:48:04 -- scripts/common.sh@354 -- # echo 2
00:04:11.844 17:48:04 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:11.844 17:48:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:11.844 17:48:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:11.844 17:48:04 -- scripts/common.sh@367 -- # return 0
00:04:11.844 17:48:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:11.844 17:48:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:11.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:11.845 --rc genhtml_branch_coverage=1
00:04:11.845 --rc genhtml_function_coverage=1
00:04:11.845 --rc genhtml_legend=1
00:04:11.845 --rc geninfo_all_blocks=1
00:04:11.845 --rc geninfo_unexecuted_blocks=1
00:04:11.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:11.845 '
00:04:11.845 17:48:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:04:11.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:11.845 --rc genhtml_branch_coverage=1
00:04:11.845 --rc genhtml_function_coverage=1
00:04:11.845 --rc genhtml_legend=1
00:04:11.845 --rc geninfo_all_blocks=1
00:04:11.845 --rc geninfo_unexecuted_blocks=1
00:04:11.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:11.845 '
00:04:11.845 17:48:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:04:11.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:11.845 --rc genhtml_branch_coverage=1
00:04:11.845 --rc genhtml_function_coverage=1
00:04:11.845 --rc genhtml_legend=1
00:04:11.845 --rc geninfo_all_blocks=1
00:04:11.845 --rc geninfo_unexecuted_blocks=1
00:04:11.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:11.845 '
00:04:11.845 17:48:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:04:11.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:11.845 --rc genhtml_branch_coverage=1
00:04:11.845 --rc genhtml_function_coverage=1
00:04:11.845 --rc genhtml_legend=1
00:04:11.845 --rc geninfo_all_blocks=1
00:04:11.845 --rc geninfo_unexecuted_blocks=1
00:04:11.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:11.845 '
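The version-compare trace above (lt 1.15 2 via cmp_versions) splits each version on IFS=.-: and compares the components numerically left to right, looping up to the longer component count. A condensed bash sketch of the same walk; the function name is illustrative:

# Sketch of the cmp_versions walk traced above.
version_lt_sketch() {
    local -a ver1 ver2
    local IFS=.-: v n
    read -ra ver1 <<< "$1"          # "1.15" -> (1 15)
    read -ra ver2 <<< "$2"          # "2"    -> (2)
    n=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < n; v++ )); do
        (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
        (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
    done
    return 1                        # equal versions are not "less than"
}
# version_lt_sketch 1.15 2 succeeds, matching the "return 0" in the trace.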
00:04:11.845 17:48:04 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:04:11.845 17:48:04 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:04:11.845 17:48:04 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:04:11.845 17:48:04 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:04:11.845 17:48:04 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:04:11.845 17:48:04 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:04:11.845 17:48:04 -- setup/common.sh@17 -- # local get=Hugepagesize
00:04:11.845 17:48:04 -- setup/common.sh@18 -- # local node=
00:04:11.845 17:48:04 -- setup/common.sh@19 -- # local var val
00:04:11.845 17:48:04 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.845 17:48:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.845 17:48:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:11.845 17:48:04 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:11.845 17:48:04 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.845 17:48:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39345996 kB' 'MemAvailable: 43073696 kB' 'Buffers: 8940 kB' 'Cached: 12525312 kB' 'SwapCached: 0 kB' 'Active: 9415632 kB' 'Inactive: 3688312 kB' 'Active(anon): 8998788 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573248 kB' 'Mapped: 154080 kB' 'Shmem: 8429096 kB' 'KReclaimable: 239268 kB' 'Slab: 914680 kB' 'SReclaimable: 239268 kB' 'SUnreclaim: 675412 kB' 'KernelStack: 22032 kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 10306536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214388 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.845 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.845 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # continue
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # IFS=': '
00:04:11.846 17:48:04 -- setup/common.sh@31 -- # read -r var val _
00:04:11.846 17:48:04 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:11.846 17:48:04 -- setup/common.sh@33 -- # echo 2048
00:04:11.846 17:48:04 -- setup/common.sh@33 -- # return 0
00:04:11.846 17:48:04 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:04:11.846 17:48:04 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:04:11.846 17:48:04 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:04:11.846 17:48:04 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:04:11.846 17:48:04 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:04:11.846 17:48:04 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:04:11.846 17:48:04 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:04:11.846 17:48:04 -- setup/hugepages.sh@207 -- # get_nodes
00:04:11.846 17:48:04 -- setup/hugepages.sh@27 -- # local node
00:04:11.846 17:48:04 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:11.846 17:48:04 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:04:11.846 17:48:04 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:11.846 17:48:04 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:11.846 17:48:04 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:11.846 17:48:04 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
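The get_meminfo walk traced above reads /proc/meminfo into an array, strips any "Node N " prefix, and scans "key: value" pairs split on IFS=': ' until the requested key matches, echoing its value (2048 for Hugepagesize here). A compact bash restatement of that walk, assuming the same two sources the trace probes; the function name is illustrative:

# Sketch of the get_meminfo lookup traced above.
get_meminfo_sketch() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo var val _ line
    local -a mem
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
        && mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    shopt -s extglob
    mem=("${mem[@]#Node +([0-9]) }")   # harmless on /proc/meminfo lines
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}
# get_meminfo_sketch Hugepagesize -> 2048 on this machine, per the trace.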
00:04:11.846 17:48:04 -- setup/hugepages.sh@208 -- # clear_hp
00:04:11.846 17:48:04 -- setup/hugepages.sh@37 -- # local node hp
00:04:11.846 17:48:04 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:11.846 17:48:04 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.846 17:48:04 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.846 17:48:04 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.846 17:48:04 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.846 17:48:04 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:11.846 17:48:04 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.846 17:48:04 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.846 17:48:04 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.846 17:48:04 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.846 17:48:04 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:11.846 17:48:04 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:11.846 17:48:04 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:04:11.846 17:48:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:11.846 17:48:04 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:11.846 17:48:04 -- common/autotest_common.sh@10 -- # set +x
00:04:12.106 ************************************
00:04:12.106 START TEST default_setup
00:04:12.106 ************************************
00:04:12.107 17:48:04 -- common/autotest_common.sh@1114 -- # default_setup
00:04:12.107 17:48:04 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:04:12.107 17:48:04 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:12.107 17:48:04 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:12.107 17:48:04 -- setup/hugepages.sh@51 -- # shift
00:04:12.107 17:48:04 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:12.107 17:48:04 -- setup/hugepages.sh@52 -- # local node_ids
00:04:12.107 17:48:04 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:12.107 17:48:04 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:12.107 17:48:04 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:12.107 17:48:04 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:12.107 17:48:04 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:12.107 17:48:04 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:12.107 17:48:04 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:12.107 17:48:04 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:12.107 17:48:04 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:12.107 17:48:04 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:12.107 17:48:04 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:12.107 17:48:04 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:12.107 17:48:04 -- setup/hugepages.sh@73 -- # return 0
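The arithmetic behind "nr_hugepages=1024" above, spelled out. The values in the trace are consistent with kB units: 2097152 kB (2 GiB) divided by the 2048 kB page size from get_meminfo Hugepagesize yields 1024 pages, which get_test_nr_hugepages_per_node then pins to the single requested node id 0. A sketch under that assumption; the echo is illustrative:

# Sketch of the sizing math traced above (sizes assumed to be in kB).
declare -a nodes_test
size_kb=2097152
default_hugepages_kb=2048
nr_hugepages=$(( size_kb / default_hugepages_kb ))   # -> 1024
nodes_test[0]=$nr_hugepages                          # single node id: 0
echo "requesting ${nr_hugepages} x ${default_hugepages_kb} kB pages on node 0"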
00:04:12.107 17:48:04 -- setup/hugepages.sh@137 -- # setup output
00:04:12.107 17:48:04 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:12.107 17:48:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:15.399 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:15.399 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:15.659 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:17.572 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:17.572 17:48:10 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:04:17.572 17:48:10 -- setup/hugepages.sh@89 -- # local node
00:04:17.572 17:48:10 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:17.572 17:48:10 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:17.572 17:48:10 -- setup/hugepages.sh@92 -- # local surp
00:04:17.572 17:48:10 -- setup/hugepages.sh@93 -- # local resv
00:04:17.572 17:48:10 -- setup/hugepages.sh@94 -- # local anon
00:04:17.572 17:48:10 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
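The bracketed-pattern test just above is verify_nr_hugepages checking transparent hugepages before it counts anonymous pages: the kernel reports the active THP policy by bracketing one of "always madvise never", and here the sysfs file reads "always [madvise] never", so the != *[never]* test succeeds. Reconstructed as a standalone check:

# Sketch of the THP gate from the trace above.
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *'[never]'* ]]; then
    # e.g. "always [madvise] never" on this machine, per the trace
    echo "THP policy: $thp -- anonymous hugepages may be in use"
fi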
00:04:17.572 17:48:10 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:17.572 17:48:10 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:17.572 17:48:10 -- setup/common.sh@18 -- # local node=
00:04:17.572 17:48:10 -- setup/common.sh@19 -- # local var val
00:04:17.572 17:48:10 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.572 17:48:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.572 17:48:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.572 17:48:10 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.572 17:48:10 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.572 17:48:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41559792 kB' 'MemAvailable: 45286988 kB' 'Buffers: 8940 kB' 'Cached: 12525448 kB' 'SwapCached: 0 kB' 'Active: 9411120 kB' 'Inactive: 3688312 kB' 'Active(anon): 8994276 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 568356 kB' 'Mapped: 153572 kB' 'Shmem: 8429232 kB' 'KReclaimable: 238260 kB' 'Slab: 912228 kB' 'SReclaimable: 238260 kB' 'SUnreclaim: 673968 kB' 'KernelStack: 22000 kB' 'PageTables: 7988 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10299132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.572 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.572 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.573 17:48:10 -- setup/common.sh@33 -- # echo 0
00:04:17.573 17:48:10 -- setup/common.sh@33 -- # return 0
00:04:17.573 17:48:10 -- setup/hugepages.sh@97 -- # anon=0
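After anon, the trace samples two more counters the same way. Kernel semantics, stated here as background rather than taken from this log: HugePages_Surp counts pages allocated beyond nr_hugepages via overcommit, and HugePages_Rsvd counts pages promised to mappings but not yet faulted in; both are 0 in this run. An equivalent standalone read (awk one-liners, not the script's own get_meminfo path):

# Sketch: reading the surplus/reserved counters verify_nr_hugepages samples.
surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
echo "surplus=${surp} reserved=${resv}"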
00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.573 17:48:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.573 17:48:10 -- setup/common.sh@32 -- # [[ SwapFree 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.573 17:48:10 -- setup/common.sh@32 -- # continue
[... setup/common.sh@31-32: the IFS=': ' / read -r var val _ / no-match continue cycle repeats for each remaining key: Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd ...]
00:04:17.574 17:48:10 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.574 17:48:10 -- setup/common.sh@33 -- # echo 0
00:04:17.574 17:48:10 -- setup/common.sh@33 -- # return 0
00:04:17.574 17:48:10 -- setup/hugepages.sh@99 -- # surp=0
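The scan traced above is setup/common.sh's get_meminfo: it walks the meminfo lines with IFS=': ', skipping keys until the requested one (here HugePages_Surp) matches, then echoes the value and returns. The helper's full source is not part of this log, so the following is a minimal sketch of the traced pattern rather than SPDK's exact code; get_meminfo_sketch is a hypothetical name.

#!/usr/bin/env bash
# Minimal sketch of the key scan in the xtrace above: split each
# /proc/meminfo line on ': ' and stop at the first matching key.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # e.g. var=MemTotal val=60283796 _=kB, or var=HugePages_Surp val=0
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
}

get_meminfo_sketch HugePages_Surp    # prints 0 on this box, per the dump below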
00:04:17.574 17:48:10 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:17.574 17:48:10 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:17.574 17:48:10 -- setup/common.sh@18 -- # local node=
00:04:17.574 17:48:10 -- setup/common.sh@19 -- # local var val
00:04:17.574 17:48:10 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.574 17:48:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.574 17:48:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.574 17:48:10 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.574 17:48:10 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.574 17:48:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.574 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.575 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.575 17:48:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41562656 kB' 'MemAvailable: 45289852 kB' 'Buffers: 8940 kB' 'Cached: 12525452 kB' 'SwapCached: 0 kB' 'Active: 9411564 kB' 'Inactive: 3688312 kB' 'Active(anon): 8994720 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569324 kB' 'Mapped: 153436 kB' 'Shmem: 8429236 kB' 'KReclaimable: 238260 kB' 'Slab: 912244 kB' 'SReclaimable: 238260 kB' 'SUnreclaim: 673984 kB' 'KernelStack: 22192 kB' 'PageTables: 8528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10299160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... setup/common.sh@31-32: no-match/continue cycle repeats for every key from MemTotal through HugePages_Free ...]
00:04:17.576 17:48:10 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:17.576 17:48:10 -- setup/common.sh@33 -- # echo 0
00:04:17.576 17:48:10 -- setup/common.sh@33 -- # return 0
00:04:17.576 17:48:10 -- setup/hugepages.sh@100 -- # resv=0
00:04:17.576 17:48:10 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:17.576 nr_hugepages=1024
00:04:17.576 17:48:10 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:17.576 resv_hugepages=0
00:04:17.576 17:48:10 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:17.576 surplus_hugepages=0
00:04:17.576 17:48:10 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:17.576 anon_hugepages=0
00:04:17.576 17:48:10 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.576 17:48:10 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
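The hugepages.sh@107/@109 checks just traced assert that the kernel's reported total equals the configured page count plus surplus and reserved pages. The same check written standalone (awk stands in for the traced scanner; the variable names mirror the trace but the script itself is illustrative, not SPDK's):

#!/usr/bin/env bash
# Standalone version of the hugepages.sh consistency check:
# HugePages_Total must equal nr_hugepages + HugePages_Surp + HugePages_Rsvd.
nr_hugepages=1024    # the count this test configured
surp=$(awk '/^HugePages_Surp:/  {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/  {print $2}' /proc/meminfo)
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total surp=$surp resv=$resv"
else
    echo "mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
    exit 1
fi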
00:04:17.576 17:48:10 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:17.576 17:48:10 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:17.576 17:48:10 -- setup/common.sh@18 -- # local node=
00:04:17.576 17:48:10 -- setup/common.sh@19 -- # local var val
00:04:17.576 17:48:10 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.576 17:48:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.576 17:48:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.576 17:48:10 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.576 17:48:10 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.576 17:48:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.577 17:48:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41564232 kB' 'MemAvailable: 45291428 kB' 'Buffers: 8940 kB' 'Cached: 12525456 kB' 'SwapCached: 0 kB' 'Active: 9410812 kB' 'Inactive: 3688312 kB' 'Active(anon): 8993968 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569036 kB' 'Mapped: 153436 kB' 'Shmem: 8429240 kB' 'KReclaimable: 238260 kB' 'Slab: 912244 kB' 'SReclaimable: 238260 kB' 'SUnreclaim: 673984 kB' 'KernelStack: 21936 kB' 'PageTables: 7872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10297656 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:17.577 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.577 17:48:10 -- setup/common.sh@31 -- # read -r var val _
[... setup/common.sh@31-32: no-match/continue cycle repeats for every key from MemTotal through Unaccepted ...]
00:04:17.579 17:48:10 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:17.579 17:48:10 -- setup/common.sh@33 -- # echo 1024
00:04:17.579 17:48:10 -- setup/common.sh@33 -- # return 0
00:04:17.579 17:48:10 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.579 17:48:10 -- setup/hugepages.sh@112 -- # get_nodes
00:04:17.579 17:48:10 -- setup/hugepages.sh@27 -- # local node
00:04:17.579 17:48:10 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.579 17:48:10 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:17.579 17:48:10 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.579 17:48:10 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:17.579 17:48:10 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:17.579 17:48:10 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:17.579 17:48:10 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:17.579 17:48:10 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:17.579 17:48:10 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:17.579 17:48:10 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.579 17:48:10 -- setup/common.sh@18 -- # local node=0
00:04:17.579 17:48:10 -- setup/common.sh@19 -- # local var val
00:04:17.579 17:48:10 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.579 17:48:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.579 17:48:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:17.579 17:48:10 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:17.579 17:48:10 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.579 17:48:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.579 17:48:10 -- setup/common.sh@31 -- # IFS=': '
00:04:17.579 17:48:10 -- setup/common.sh@31 -- # read -r var val _
00:04:17.579 17:48:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25115372 kB' 'MemUsed: 7519064 kB' 'SwapCached: 0 kB' 'Active: 3691716 kB' 'Inactive: 168948 kB' 'Active(anon): 3569332 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3681524 kB' 'Mapped: 60692 kB' 'AnonPages: 182304 kB' 'Shmem: 3390192 kB' 'KernelStack: 10664 kB' 'PageTables: 3252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153472 kB' 'Slab: 469272 kB' 'SReclaimable: 153472 kB' 'SUnreclaim: 315800 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
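For this per-node pass, node=0, so mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo, where every line carries a "Node 0 " prefix; the traced mem=("${mem[@]#Node +([0-9]) }") expansion strips that prefix (a +([0-9]) extglob pattern) before the usual key scan. A standalone sketch of that branch; in a fresh script extglob must be enabled explicitly, and the node0 path assumes a Linux box with at least one NUMA node exposed:

#!/usr/bin/env bash
# Sketch of the per-node branch traced above: node meminfo lines read
# "Node 0 HugePages_Surp: 0", so strip the "Node N " prefix first.
shopt -s extglob                       # needed for the +([0-9]) pattern
node=0
mem_f=/sys/devices/system/node/node${node}/meminfo
mapfile -t mem < "$mem_f"              # one array element per line
mem=("${mem[@]#Node +([0-9]) }")       # drop the "Node 0 " prefix
for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == HugePages_Surp ]] && echo "$val"   # 0 on node0, per the dump above
done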
[... setup/common.sh@31-32: no-match/continue cycle repeats for every node0 key from MemTotal through HugePages_Free ...]
00:04:17.581 17:48:10 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.581 17:48:10 -- setup/common.sh@33 -- # echo 0
00:04:17.581 17:48:10 -- setup/common.sh@33 -- # return 0
00:04:17.581 17:48:10 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.581 17:48:10 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.581 17:48:10 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.581 17:48:10 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.581 17:48:10 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:17.581 node0=1024 expecting 1024
00:04:17.581 17:48:10 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:17.581
00:04:17.581 real 0m5.517s
00:04:17.581 user 0m1.456s
00:04:17.581 sys 0m2.525s
00:04:17.581 17:48:10 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:17.581 17:48:10 -- common/autotest_common.sh@10 -- # set +x
00:04:17.581 ************************************
00:04:17.581 END TEST default_setup
00:04:17.581 ************************************
00:04:17.581 17:48:10 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:17.581 17:48:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:17.581 17:48:10 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:17.581 17:48:10 -- common/autotest_common.sh@10 -- # set +x
00:04:17.581 ************************************
00:04:17.581 START TEST per_node_1G_alloc
00:04:17.581 ************************************
00:04:17.581 17:48:10 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:04:17.581 17:48:10 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:17.581 17:48:10 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:17.581 17:48:10 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:17.581 17:48:10 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:17.581 17:48:10 -- setup/hugepages.sh@51 -- # shift
00:04:17.581 17:48:10 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:17.581 17:48:10 -- setup/hugepages.sh@52 -- # local node_ids
00:04:17.581 17:48:10 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:17.581 17:48:10 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:17.581 17:48:10 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:17.581 17:48:10 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:17.581 17:48:10 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:17.581 17:48:10 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:17.581 17:48:10 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:17.581 17:48:10 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:17.581 17:48:10 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:17.581 17:48:10 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:17.581 17:48:10 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:17.581 17:48:10 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:17.581 17:48:10 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:17.581 17:48:10 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:17.581 17:48:10 -- setup/hugepages.sh@73 -- # return 0
00:04:17.581 17:48:10 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:17.581 17:48:10 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:17.581 17:48:10 -- setup/hugepages.sh@146 -- # setup output
00:04:17.581 17:48:10 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.581 17:48:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
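The prologue just traced turns the 1048576 kB (1 GiB) request into 512 two-megabyte pages per listed node and hands NRHUGE=512 HUGENODE=0,1 to scripts/setup.sh, whose device output follows. setup.sh's internals are not shown in this log excerpt; the sketch below only reproduces the arithmetic and the standard kernel sysfs knob such a per-node reservation ultimately writes, as an assumption about the mechanism rather than SPDK's exact code:

#!/usr/bin/env bash
# Sketch of what NRHUGE=512 HUGENODE=0,1 requests: 1048576 kB per node
# divided by the 2048 kB hugepage size = 512 pages on each node.
# (setup.sh itself is not reproduced in this log excerpt.)
size_kb=1048576
hugepage_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 here
nr=$(( size_kb / hugepage_kb ))                                  # 512
IFS=',' read -ra nodes <<< "${HUGENODE:-0,1}"
for n in "${nodes[@]}"; do
    # Standard per-node hugepage reservation interface in sysfs.
    echo "$nr" | sudo tee \
        "/sys/devices/system/node/node${n}/hugepages/hugepages-${hugepage_kb}kB/nr_hugepages"
done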
(8086 2021): Already using the vfio-pci driver 00:04:20.877 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:20.877 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:20.877 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:21.140 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:21.140 17:48:13 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:21.140 17:48:13 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:21.140 17:48:13 -- setup/hugepages.sh@89 -- # local node 00:04:21.140 17:48:13 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:21.140 17:48:13 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:21.140 17:48:13 -- setup/hugepages.sh@92 -- # local surp 00:04:21.140 17:48:13 -- setup/hugepages.sh@93 -- # local resv 00:04:21.140 17:48:13 -- setup/hugepages.sh@94 -- # local anon 00:04:21.140 17:48:13 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:21.140 17:48:13 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:21.140 17:48:13 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:21.140 17:48:13 -- setup/common.sh@18 -- # local node= 00:04:21.140 17:48:13 -- setup/common.sh@19 -- # local var val 00:04:21.140 17:48:13 -- setup/common.sh@20 -- # local mem_f mem 00:04:21.140 17:48:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.140 17:48:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.140 17:48:13 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.140 17:48:13 -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.140 17:48:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.140 17:48:13 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.140 17:48:13 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.140 17:48:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41563296 kB' 'MemAvailable: 45290460 kB' 'Buffers: 8940 kB' 'Cached: 12525576 kB' 'SwapCached: 0 kB' 'Active: 9406184 kB' 'Inactive: 3688312 kB' 'Active(anon): 8989340 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563332 kB' 'Mapped: 151292 kB' 'Shmem: 8429360 kB' 'KReclaimable: 238196 kB' 'Slab: 912652 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674456 kB' 'KernelStack: 21744 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10253396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 
00:04:21.140 17:48:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41563296 kB' 'MemAvailable: 45290460 kB' 'Buffers: 8940 kB' 'Cached: 12525576 kB' 'SwapCached: 0 kB' 'Active: 9406184 kB' 'Inactive: 3688312 kB' 'Active(anon): 8989340 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563332 kB' 'Mapped: 151292 kB' 'Shmem: 8429360 kB' 'KReclaimable: 238196 kB' 'Slab: 912652 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674456 kB' 'KernelStack: 21744 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10253396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:21.140 17:48:13 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:21.140 17:48:13 -- setup/common.sh@32 -- # continue
[... the same [[ key == pattern ]] / continue / IFS=': ' / read -r var val _ cycle repeats for every remaining non-matching /proc/meminfo key ...]
00:04:21.141 17:48:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:21.141 17:48:13 -- setup/common.sh@33 -- # echo 0
00:04:21.141 17:48:13 -- setup/common.sh@33 -- # return 0
00:04:21.141 17:48:13 -- setup/hugepages.sh@97 -- # anon=0
00:04:21.141 17:48:13 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:21.141 17:48:13 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.141 17:48:13 -- setup/common.sh@18 -- # local node=
00:04:21.141 17:48:13 -- setup/common.sh@19 -- # local var val
00:04:21.141 17:48:13 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.141 17:48:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.141 17:48:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.141 17:48:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.141 17:48:13 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.141 17:48:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.141 17:48:13 -- setup/common.sh@31 -- # IFS=': '
00:04:21.141 17:48:13 -- setup/common.sh@31 -- # read -r var val _
00:04:21.141 17:48:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41565060 kB' 'MemAvailable: 45292224 kB' 'Buffers: 8940 kB' 'Cached: 12525580 kB' 'SwapCached: 0 kB' 'Active: 9405800 kB' 'Inactive: 3688312 kB' 'Active(anon): 8988956 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562864 kB' 'Mapped: 151292 kB' 'Shmem: 8429364 kB' 'KReclaimable: 238196 kB' 'Slab: 912644 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674448 kB' 'KernelStack: 21728 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10253408 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:21.141 17:48:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.141 17:48:13 -- setup/common.sh@32 -- # continue
[... per-key scan as above, until HugePages_Surp matches ...]
00:04:21.142 17:48:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.142 17:48:13 -- setup/common.sh@33 -- # echo 0
00:04:21.142 17:48:13 -- setup/common.sh@33 -- # return 0
00:04:21.142 17:48:13 -- setup/hugepages.sh@99 -- # surp=0
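Both HugePages_Surp and the HugePages_Rsvd value queried next come back 0 here, which is what the later accounting expects. The kernel also exposes the same counters per page size under /sys/kernel/mm/hugepages; a quick way to cross-check them outside the test scripts (standard kernel hugetlb sysfs layout, variable names ours):

  # Cross-check the hugepage counters for the default 2 MiB size via sysfs.
  hp=/sys/kernel/mm/hugepages/hugepages-2048kB
  echo "nr=$(<"$hp"/nr_hugepages) free=$(<"$hp"/free_hugepages)" \
       "surplus=$(<"$hp"/surplus_hugepages) resv=$(<"$hp"/resv_hugepages)"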
00:04:21.142 17:48:13 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:21.142 17:48:13 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:21.142 17:48:13 -- setup/common.sh@18 -- # local node=
00:04:21.142 17:48:13 -- setup/common.sh@19 -- # local var val
00:04:21.142 17:48:13 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.142 17:48:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.142 17:48:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.142 17:48:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.142 17:48:13 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.142 17:48:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.142 17:48:13 -- setup/common.sh@31 -- # IFS=': '
00:04:21.142 17:48:13 -- setup/common.sh@31 -- # read -r var val _
00:04:21.142 17:48:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41566992 kB' 'MemAvailable: 45294156 kB' 'Buffers: 8940 kB' 'Cached: 12525604 kB' 'SwapCached: 0 kB' 'Active: 9405764 kB' 'Inactive: 3688312 kB' 'Active(anon): 8988920 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562812 kB' 'Mapped: 151292 kB' 'Shmem: 8429388 kB' 'KReclaimable: 238196 kB' 'Slab: 912596 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674400 kB' 'KernelStack: 21712 kB' 'PageTables: 7356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10253420 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:21.142 17:48:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.142 17:48:13 -- setup/common.sh@32 -- # continue
[... per-key scan as above, until HugePages_Rsvd matches ...]
00:04:21.144 17:48:13 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.144 17:48:13 -- setup/common.sh@33 -- # echo 0
00:04:21.144 17:48:13 -- setup/common.sh@33 -- # return 0
00:04:21.144 17:48:13 -- setup/hugepages.sh@100 -- # resv=0
00:04:21.144 17:48:13 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:21.144 nr_hugepages=1024 17:48:13 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:21.144 resv_hugepages=0 17:48:13 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:21.144 surplus_hugepages=0 17:48:13 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:21.144 anon_hugepages=0 17:48:13 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.144 17:48:13 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:21.144 17:48:13 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:21.144 17:48:13 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:21.144 17:48:13 -- setup/common.sh@18 -- # local node=
00:04:21.144 17:48:13 -- setup/common.sh@19 -- # local var val
00:04:21.144 17:48:13 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.144 17:48:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.144 17:48:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.144 17:48:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.144 17:48:13 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.144 17:48:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
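The two (( ... )) tests a few records up are the heart of verify_nr_hugepages: the configured count (1024) must equal what the kernel reports once surplus and reserved pages are accounted for. A compact sketch of that check, reusing the get_meminfo sketch shown earlier (the function name and messages here are illustrative, not SPDK's):

  # Verify that the kernel's hugepage accounting matches what we configured.
  check_hugepages() {
      local expected=$1 total surp resv
      total=$(get_meminfo HugePages_Total)
      surp=$(get_meminfo HugePages_Surp)
      resv=$(get_meminfo HugePages_Rsvd)
      if (( expected != total + surp + resv )); then
          echo "hugepages mismatch: total=$total surp=$surp resv=$resv" >&2
          return 1
      fi
  }

  check_hugepages 1024   # passes here: 1024 == 1024 + 0 + 0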
00:04:21.144 17:48:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41567360 kB' 'MemAvailable: 45294524 kB' 'Buffers: 8940 kB' 'Cached: 12525620 kB' 'SwapCached: 0 kB' 'Active: 9405776 kB' 'Inactive: 3688312 kB' 'Active(anon): 8988932 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562812 kB' 'Mapped: 151292 kB' 'Shmem: 8429404 kB' 'KReclaimable: 238196 kB' 'Slab: 912596 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674400 kB' 'KernelStack: 21712 kB' 'PageTables: 7356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10253436 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:21.144 17:48:13 -- setup/common.sh@31 -- # IFS=': '
00:04:21.144 17:48:13 -- setup/common.sh@31 -- # read -r var val _
00:04:21.144 17:48:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.144 17:48:13 -- setup/common.sh@32 -- # continue
[... per-key scan as above, until HugePages_Total matches; partway through, the timestamps advance from 00:04:21.144/17:48:13 to 00:04:21.406/17:48:14 ...]
00:04:21.407 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.407 17:48:14 -- setup/common.sh@33 -- # echo 1024
00:04:21.407 17:48:14 -- setup/common.sh@33 -- # return 0
00:04:21.407 17:48:14 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.407 17:48:14 -- setup/hugepages.sh@112 -- # get_nodes
00:04:21.407 17:48:14 -- setup/hugepages.sh@27 -- # local node
00:04:21.407 17:48:14 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.408 17:48:14 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:21.408 17:48:14 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.408 17:48:14 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:21.408 17:48:14 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:21.408 17:48:14 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:21.408 17:48:14 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:21.408 17:48:14 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:21.408 17:48:14 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:21.408 17:48:14 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.408 17:48:14 -- setup/common.sh@18 -- # local node=0
00:04:21.408 17:48:14 -- setup/common.sh@19 -- # local var val
00:04:21.408 17:48:14 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.408 17:48:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.408 17:48:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:21.408 17:48:14 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:21.408 17:48:14 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.408 17:48:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': '
00:04:21.408 17:48:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26204556 kB' 'MemUsed: 6429880 kB' 'SwapCached: 0 kB' 'Active: 3691680 kB' 'Inactive: 168948 kB' 'Active(anon): 3569296 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3681648 kB' 'Mapped: 59600 kB' 'AnonPages: 182236 kB' 'Shmem: 3390316 kB' 'KernelStack: 10648 kB' 'PageTables: 3388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153408 kB' 'Slab: 469404 kB' 'SReclaimable: 153408 kB' 'SUnreclaim: 315996 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
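verify_nr_hugepages then walks the per-node view: with two NUMA nodes the 1024 pages are expected to split 512/512, and each node's meminfo carries node-local HugePages_* counters. A quick way to eyeball that split outside the test scripts (not part of SPDK):

  # Print the node-local hugepage counters for every NUMA node.
  for n in /sys/devices/system/node/node[0-9]*; do
      echo "=== ${n##*/} ==="
      grep HugePages_ "$n/meminfo"
  done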
'HugePages_Surp: 0' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
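The backslash-escaped targets in the xtrace above (e.g. \H\u\g\e\P\a\g\e\s\_\S\u\r\p) are just how bash re-quotes the literal right-hand side of the [[ ... == ... ]] test; the loop itself is a plain key lookup over meminfo-style "Key: value" lines, continuing until the requested key matches and then echoing its value. A minimal standalone sketch of that lookup (get_field is a hypothetical helper for illustration, not the suite's get_meminfo):

    # Print the value column of a "Key: value" line in a meminfo-style file.
    get_field() {
        local get=$1 file=${2:-/proc/meminfo} var val _
        while IFS=': ' read -r var val _; do
            # A quoted RHS forces a literal match, same as the escaped xtrace form.
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "$file"
        return 1
    }
    # e.g.: get_field HugePages_Surp    -> 0 on this box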
00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.408 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.408 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 
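When get_meminfo is given a node number, the trace shows mem_f being switched from /proc/meminfo to /sys/devices/system/node/node0/meminfo; every line in that per-node file carries a "Node 0 " prefix, which the ${mem[@]#Node +([0-9]) } expansion strips so the same parse loop works for both sources. A short sketch of just that normalization step, assuming extglob is enabled (the +([0-9]) pattern requires it):

    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    # "Node 0 MemTotal: 32634436 kB" -> "MemTotal: 32634436 kB"
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}"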
00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.409 17:48:14 -- setup/common.sh@33 -- # echo 0 00:04:21.409 17:48:14 -- setup/common.sh@33 -- # return 0 00:04:21.409 17:48:14 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:21.409 17:48:14 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:21.409 17:48:14 -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:04:21.409 17:48:14 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:21.409 17:48:14 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:21.409 17:48:14 -- setup/common.sh@18 -- # local node=1 00:04:21.409 17:48:14 -- setup/common.sh@19 -- # local var val 00:04:21.409 17:48:14 -- setup/common.sh@20 -- # local mem_f mem 00:04:21.409 17:48:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.409 17:48:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:21.409 17:48:14 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:21.409 17:48:14 -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.409 17:48:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.409 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.409 17:48:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 15365752 kB' 'MemUsed: 12283608 kB' 'SwapCached: 0 kB' 'Active: 5714244 kB' 'Inactive: 3519364 kB' 'Active(anon): 5419784 kB' 'Inactive(anon): 0 kB' 'Active(file): 294460 kB' 'Inactive(file): 3519364 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8852912 kB' 'Mapped: 91640 kB' 'AnonPages: 380904 kB' 'Shmem: 5039088 kB' 'KernelStack: 11080 kB' 'PageTables: 4020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84788 kB' 'Slab: 443192 kB' 'SReclaimable: 84788 kB' 'SUnreclaim: 358404 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
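The two per-node dumps each report 'HugePages_Total: 512' and 'HugePages_Surp: 0', which is exactly what the hugepages.sh checks around @110-@117 assert against the global total of 1024. One hedged way to reproduce that sum outside the suite (an illustrative snippet, not part of the test scripts):

    total=0
    for f in /sys/devices/system/node/node*/meminfo; do
        total=$(( total + $(awk '/HugePages_Total/ {print $NF}' "$f") ))
    done
    echo "$total"   # should match HugePages_Total in /proc/meminfo (1024 here)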
00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 
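The node-1 dump above is internally consistent: MemUsed is simply MemTotal minus MemFree, and 27649360 kB - 15365752 kB = 12283608 kB matches the reported 'MemUsed: 12283608 kB'. The same identity holds for the node-0 dump earlier (32634436 - 26204556 = 6429880 kB).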
00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.410 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.410 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue 
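Each get_meminfo call ends the same way: the first line whose key equals the requested name short-circuits the loop via 'echo $val' / 'return 0', so callers capture the value with command substitution; here HugePages_Surp (surplus pages allocated beyond the configured pool) is 0 on both nodes, so nodes_test[node] stays at 512. A usage sketch built on the hypothetical get_field helper above:

    # Per-node lines carry a "Node N " prefix, so strip it before the lookup
    # (the suite does the same with its mapfile/extglob step).
    surp=$(get_field HugePages_Surp \
           <(sed 's/^Node [0-9]* //' /sys/devices/system/node/node1/meminfo))
    (( surp == 0 )) && echo "node1: no surplus hugepages"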
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': '
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': '
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': '
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': '
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # continue
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # IFS=': '
00:04:21.411 17:48:14 -- setup/common.sh@31 -- # read -r var val _
00:04:21.411 17:48:14 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.411 17:48:14 -- setup/common.sh@33 -- # echo 0
00:04:21.411 17:48:14 -- setup/common.sh@33 -- # return 0
00:04:21.411 17:48:14 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:21.411 17:48:14 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:21.411 17:48:14 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:21.411 17:48:14 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:21.411 node0=512 expecting 512
00:04:21.411 17:48:14 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:21.411 17:48:14 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:21.411 17:48:14 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:21.411 17:48:14 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:21.411 node1=512 expecting 512
00:04:21.411 17:48:14 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:21.411
00:04:21.411 real 0m3.802s
00:04:21.411 user 0m1.427s
00:04:21.411 sys 0m2.447s
00:04:21.411 17:48:14 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:21.411 17:48:14 -- common/autotest_common.sh@10 -- # set +x
00:04:21.411 ************************************
00:04:21.411 END TEST per_node_1G_alloc
00:04:21.411 ************************************
00:04:21.411 17:48:14 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:21.411 17:48:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:21.411 17:48:14 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:21.411 17:48:14 -- common/autotest_common.sh@10 -- # set +x
00:04:21.411 ************************************
00:04:21.411 START TEST even_2G_alloc
00:04:21.411 ************************************
00:04:21.411 17:48:14 -- common/autotest_common.sh@1114 -- # even_2G_alloc
00:04:21.411 17:48:14 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:21.411 17:48:14 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:21.411 17:48:14 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:21.411 17:48:14 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:21.411 17:48:14 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:21.411 17:48:14 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:21.411 17:48:14 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:21.411 17:48:14 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:21.411 17:48:14 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:21.411 17:48:14 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:21.411 17:48:14 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:21.411 17:48:14 -- setup/hugepages.sh@83 -- # : 512
00:04:21.411 17:48:14 -- setup/hugepages.sh@84 -- # : 1
00:04:21.411 17:48:14 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:21.411 17:48:14 -- setup/hugepages.sh@83 -- # : 0
00:04:21.411 17:48:14 -- setup/hugepages.sh@84 -- # : 0
00:04:21.411 17:48:14 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:21.411 17:48:14 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:21.411 17:48:14 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:21.411 17:48:14 -- setup/hugepages.sh@153 -- # setup output
00:04:21.411 17:48:14 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:21.411 17:48:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:24.707 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:24.707 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:24.970 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:24.970 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:24.970 17:48:17 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:24.970 17:48:17 -- setup/hugepages.sh@89 -- # local node
00:04:24.970 17:48:17 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:24.970 17:48:17 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:24.970 17:48:17 -- setup/hugepages.sh@92 -- # local surp
00:04:24.970 17:48:17 -- setup/hugepages.sh@93 -- # local resv
00:04:24.970 17:48:17 -- setup/hugepages.sh@94 -- # local anon
00:04:24.970 17:48:17 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:24.970 17:48:17 --
setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:24.970 17:48:17 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:24.970 17:48:17 -- setup/common.sh@18 -- # local node= 00:04:24.970 17:48:17 -- setup/common.sh@19 -- # local var val 00:04:24.970 17:48:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.970 17:48:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.970 17:48:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.970 17:48:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.970 17:48:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.970 17:48:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41571464 kB' 'MemAvailable: 45298628 kB' 'Buffers: 8940 kB' 'Cached: 12525704 kB' 'SwapCached: 0 kB' 'Active: 9407232 kB' 'Inactive: 3688312 kB' 'Active(anon): 8990388 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564156 kB' 'Mapped: 151332 kB' 'Shmem: 8429488 kB' 'KReclaimable: 238196 kB' 'Slab: 913084 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674888 kB' 'KernelStack: 21760 kB' 'PageTables: 7516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10254048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- 
setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.970 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.970 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 
-- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.971 17:48:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.971 17:48:17 -- setup/common.sh@33 -- # echo 0 00:04:24.971 17:48:17 -- setup/common.sh@33 -- # return 0 00:04:24.971 17:48:17 -- setup/hugepages.sh@97 -- # anon=0 00:04:24.971 17:48:17 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:24.971 17:48:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.971 17:48:17 -- setup/common.sh@18 -- # local node= 00:04:24.971 17:48:17 -- setup/common.sh@19 -- # local var val 00:04:24.971 17:48:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.971 17:48:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.971 17:48:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.971 17:48:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.971 17:48:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.971 17:48:17 -- 
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.971 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41572332 kB' 'MemAvailable: 45299496 kB' 'Buffers: 8940 kB' 'Cached: 12525704 kB' 'SwapCached: 0 kB' 'Active: 9406920 kB' 'Inactive: 3688312 kB' 'Active(anon): 8990076 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563916 kB' 'Mapped: 151304 kB' 'Shmem: 8429488 kB' 'KReclaimable: 238196 kB' 'Slab: 913084 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674888 kB' 'KernelStack: 21744 kB' 'PageTables: 7472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10254060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # 
continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.972 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.972 17:48:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 
17:48:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.973 17:48:17 -- setup/common.sh@33 -- # echo 0 00:04:24.973 17:48:17 -- setup/common.sh@33 -- # return 0 00:04:24.973 17:48:17 -- setup/hugepages.sh@99 -- # surp=0 00:04:24.973 17:48:17 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:24.973 17:48:17 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:24.973 17:48:17 -- setup/common.sh@18 -- # local node= 00:04:24.973 17:48:17 -- setup/common.sh@19 -- # local var val 00:04:24.973 17:48:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.973 17:48:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.973 17:48:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.973 17:48:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.973 17:48:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.973 17:48:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41573364 kB' 'MemAvailable: 45300528 kB' 'Buffers: 8940 kB' 'Cached: 12525716 kB' 'SwapCached: 0 kB' 'Active: 9406496 kB' 'Inactive: 3688312 kB' 'Active(anon): 8989652 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563464 kB' 'Mapped: 151304 kB' 'Shmem: 8429500 kB' 'KReclaimable: 238196 kB' 'Slab: 913096 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674900 kB' 'KernelStack: 21744 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10254072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 
-- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.973 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.973 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
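The field-by-field scan traced above is setup/common.sh's get_meminfo helper at work: it reads /proc/meminfo (or a per-node meminfo file under /sys when a node is given) into an array, strips any "Node N " prefix, then walks the fields with IFS=': ', hitting "continue" for every key that is not the one requested and echoing the value once it matches. A minimal runnable sketch of the same pattern, assuming standard /proc and /sys layouts; get_meminfo_sketch is an illustrative name, not the verbatim SPDK implementation:

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

# Sketch of the get_meminfo scan traced above (illustrative, not the
# verbatim setup/common.sh code): read a meminfo file one field at a
# time and print the value of the single requested key.
get_meminfo_sketch() {
  local get=$1 node=${2-}
  local mem_f=/proc/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  local line var val _
  while IFS= read -r line; do
    line=${line#Node +([0-9]) }         # per-node files prefix "Node N "
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == "$get" ]] || continue    # the long "continue" runs above
    echo "$val"                         # e.g. "0" for HugePages_Rsvd
    return 0
  done < "$mem_f"
  return 1
}

get_meminfo_sketch HugePages_Rsvd      # global pool, "0" in this run
get_meminfo_sketch HugePages_Surp 0    # per-node variant, NUMA node 0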
00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var 
val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # 
continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.974 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.974 17:48:17 -- setup/common.sh@33 -- # echo 0 00:04:24.974 17:48:17 -- setup/common.sh@33 -- # return 0 00:04:24.974 17:48:17 -- setup/hugepages.sh@100 -- # resv=0 00:04:24.974 17:48:17 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:24.974 nr_hugepages=1024 00:04:24.974 17:48:17 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.974 resv_hugepages=0 00:04:24.974 17:48:17 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:24.974 surplus_hugepages=0 00:04:24.974 17:48:17 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.974 anon_hugepages=0 00:04:24.974 17:48:17 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.974 17:48:17 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:24.974 17:48:17 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.974 17:48:17 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.974 17:48:17 -- setup/common.sh@18 -- # local node= 00:04:24.974 17:48:17 -- setup/common.sh@19 -- # local var val 00:04:24.974 17:48:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.974 17:48:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.974 17:48:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.974 17:48:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.974 17:48:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.974 17:48:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.974 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41574552 kB' 'MemAvailable: 45301716 kB' 'Buffers: 8940 kB' 'Cached: 12525716 kB' 'SwapCached: 0 kB' 'Active: 9406864 kB' 'Inactive: 3688312 kB' 'Active(anon): 8990020 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563992 kB' 'Mapped: 151304 kB' 'Shmem: 8429500 kB' 'KReclaimable: 238196 kB' 'Slab: 913096 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 674900 kB' 'KernelStack: 21776 kB' 'PageTables: 7584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10258636 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 
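This third scan re-reads HugePages_Total for the accounting check visible at hugepages.sh@107-110 above: with the surplus and reserved counts both measured as 0, the kernel-reported pool must equal the requested one, i.e. 1024 == nr_hugepages + surp + resv. A sketch of that arithmetic, assuming the hypothetical get_meminfo_sketch helper from the earlier sketch is in scope:

# Hypothetical wiring of the consistency check traced above; variable
# names mirror the trace, not the exact hugepages.sh code.
nr_hugepages=1024                              # requested pool size
surp=$(get_meminfo_sketch HugePages_Surp)      # surplus pages, 0 here
resv=$(get_meminfo_sketch HugePages_Rsvd)      # reserved pages, 0 here
total=$(get_meminfo_sketch HugePages_Total)    # kernel-reported pool

# The test only proceeds when the kernel's view matches the request.
(( total == nr_hugepages + surp + resv )) || echo 'hugepage pool mismatch' >&2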
00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 
-- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.975 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.975 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.976 17:48:17 -- setup/common.sh@33 -- # echo 1024 00:04:24.976 17:48:17 -- setup/common.sh@33 -- # return 0 00:04:24.976 17:48:17 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.976 17:48:17 -- setup/hugepages.sh@112 -- # get_nodes 00:04:24.976 17:48:17 -- setup/hugepages.sh@27 -- # local node 00:04:24.976 17:48:17 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.976 17:48:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.976 17:48:17 -- setup/hugepages.sh@29 -- # for node in 
/sys/devices/system/node/node+([0-9]) 00:04:24.976 17:48:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.976 17:48:17 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.976 17:48:17 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.976 17:48:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.976 17:48:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.976 17:48:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:24.976 17:48:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.976 17:48:17 -- setup/common.sh@18 -- # local node=0 00:04:24.976 17:48:17 -- setup/common.sh@19 -- # local var val 00:04:24.976 17:48:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.976 17:48:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.976 17:48:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:24.976 17:48:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:24.976 17:48:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.976 17:48:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.976 17:48:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26222132 kB' 'MemUsed: 6412304 kB' 'SwapCached: 0 kB' 'Active: 3692132 kB' 'Inactive: 168948 kB' 'Active(anon): 3569748 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3681708 kB' 'Mapped: 59612 kB' 'AnonPages: 182540 kB' 'Shmem: 3390376 kB' 'KernelStack: 10680 kB' 'PageTables: 3496 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153408 kB' 'Slab: 469400 kB' 'SReclaimable: 153408 kB' 'SUnreclaim: 315992 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.976 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.976 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # continue 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.977 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.977 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.977 17:48:17 -- setup/common.sh@33 -- # echo 0 00:04:24.977 17:48:17 -- setup/common.sh@33 -- # return 0 00:04:24.977 17:48:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.977 17:48:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.977 17:48:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:25.238 17:48:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:25.238 17:48:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:25.238 17:48:17 -- setup/common.sh@18 -- # local node=1 00:04:25.238 17:48:17 -- setup/common.sh@19 -- # local var val 00:04:25.238 17:48:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:25.238 17:48:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.238 17:48:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:25.238 17:48:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:25.238 17:48:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.238 17:48:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.238 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.238 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.238 17:48:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 15354596 kB' 'MemUsed: 12294764 kB' 'SwapCached: 0 kB' 'Active: 5715124 kB' 'Inactive: 3519364 kB' 'Active(anon): 5420664 kB' 'Inactive(anon): 0 kB' 'Active(file): 294460 kB' 'Inactive(file): 3519364 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8852948 kB' 'Mapped: 91700 kB' 'AnonPages: 381708 kB' 'Shmem: 5039124 kB' 'KernelStack: 11096 kB' 'PageTables: 4060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 
kB' 'KReclaimable: 84788 kB' 'Slab: 443708 kB' 'SReclaimable: 84788 kB' 'SUnreclaim: 358920 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:25.238 17:48:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.238 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.238 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.238 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 
17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # continue 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:25.239 17:48:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:25.239 17:48:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.239 17:48:17 -- setup/common.sh@33 -- # echo 0 00:04:25.239 17:48:17 -- setup/common.sh@33 -- # return 0 
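Both per-node scans have now returned 0 surplus pages, so the accounting that follows adds nothing to either node and the per-node totals are checked against an even split: 1024 pages across two NUMA nodes should leave HugePages_Total at 512 on each, which is what the node0/node1 meminfo printfs above report and what the "node0=512 expecting 512" / "node1=512 expecting 512" echoes below confirm. A sketch of that per-node walk, reusing the hypothetical get_meminfo_sketch helper and the extglob setting from the earlier sketch:

# Illustrative per-node check mirroring the node+([0-9]) loop in the
# trace; expected=512 assumes the even 2G split this test verifies.
expected=512
for node_dir in /sys/devices/system/node/node+([0-9]); do
  node=${node_dir##*node}                           # ".../node1" -> "1"
  total=$(get_meminfo_sketch HugePages_Total "$node")
  echo "node$node=$total expecting $expected"
done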
00:04:25.239 17:48:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:25.239 17:48:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:25.239 17:48:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:25.239 17:48:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:25.239 17:48:17 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:25.239 node0=512 expecting 512 00:04:25.239 17:48:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:25.239 17:48:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:25.239 17:48:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:25.239 17:48:17 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:25.239 node1=512 expecting 512 00:04:25.239 17:48:17 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:25.239 00:04:25.239 real 0m3.728s 00:04:25.239 user 0m1.462s 00:04:25.239 sys 0m2.339s 00:04:25.239 17:48:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:25.239 17:48:17 -- common/autotest_common.sh@10 -- # set +x 00:04:25.239 ************************************ 00:04:25.239 END TEST even_2G_alloc 00:04:25.239 ************************************ 00:04:25.239 17:48:17 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:25.240 17:48:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:25.240 17:48:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:25.240 17:48:17 -- common/autotest_common.sh@10 -- # set +x 00:04:25.240 ************************************ 00:04:25.240 START TEST odd_alloc 00:04:25.240 ************************************ 00:04:25.240 17:48:17 -- common/autotest_common.sh@1114 -- # odd_alloc 00:04:25.240 17:48:17 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:25.240 17:48:17 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:25.240 17:48:17 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:25.240 17:48:17 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:25.240 17:48:17 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:25.240 17:48:17 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:25.240 17:48:17 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:25.240 17:48:17 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:25.240 17:48:17 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:25.240 17:48:17 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:25.240 17:48:17 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:25.240 17:48:17 -- setup/hugepages.sh@83 -- # : 513 00:04:25.240 17:48:17 -- setup/hugepages.sh@84 -- # : 1 00:04:25.240 17:48:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:25.240 17:48:17 -- setup/hugepages.sh@83 -- # : 0 00:04:25.240 17:48:17 -- setup/hugepages.sh@84 -- # : 0 00:04:25.240 17:48:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:25.240 17:48:17 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:25.240 17:48:17 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:25.240 17:48:17 -- 
setup/hugepages.sh@160 -- # setup output 00:04:25.240 17:48:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:25.240 17:48:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:28.534 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:28.534 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:28.800 17:48:21 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:28.800 17:48:21 -- setup/hugepages.sh@89 -- # local node 00:04:28.800 17:48:21 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:28.800 17:48:21 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:28.800 17:48:21 -- setup/hugepages.sh@92 -- # local surp 00:04:28.800 17:48:21 -- setup/hugepages.sh@93 -- # local resv 00:04:28.800 17:48:21 -- setup/hugepages.sh@94 -- # local anon 00:04:28.800 17:48:21 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:28.800 17:48:21 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:28.800 17:48:21 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:28.800 17:48:21 -- setup/common.sh@18 -- # local node= 00:04:28.800 17:48:21 -- setup/common.sh@19 -- # local var val 00:04:28.800 17:48:21 -- setup/common.sh@20 -- # local mem_f mem 00:04:28.800 17:48:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.800 17:48:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:28.800 17:48:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:28.800 17:48:21 -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.800 17:48:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.800 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.800 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.800 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41592524 kB' 'MemAvailable: 45319688 kB' 'Buffers: 8940 kB' 'Cached: 12525836 kB' 'SwapCached: 0 kB' 'Active: 9407076 kB' 'Inactive: 3688312 kB' 'Active(anon): 8990232 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563448 kB' 'Mapped: 151404 kB' 'Shmem: 8429620 kB' 'KReclaimable: 238196 kB' 'Slab: 913728 kB' 'SReclaimable: 238196 kB' 'SUnreclaim: 675532 kB' 'KernelStack: 21712 kB' 
'PageTables: 7220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10254708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the scan "continue"s through every key of this snapshot, MemTotal through HardwareCorrupted, without matching AnonHugePages ...]
00:04:28.801 17:48:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.801 17:48:21 -- setup/common.sh@33 -- # echo 0 00:04:28.801 17:48:21 -- setup/common.sh@33 -- # return 0
00:04:28.801 17:48:21 -- setup/hugepages.sh@97 -- # anon=0
00:04:28.801 17:48:21 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:28.801 17:48:21 -- setup/common.sh@17 -- # local get=HugePages_Surp
[... xtrace elided: identical local/mapfile/IFS preamble as in the first call ...]
00:04:28.801 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41597212 kB' 'MemAvailable: 45324364 kB' 'Buffers: 8940 kB' 'Cached: 12525840 kB' 'SwapCached: 0 kB' 'Active: 9406384 kB' 'Inactive: 3688312 kB' 'Active(anon): 8989540 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563240 kB' 'Mapped: 151388 kB' 'Shmem: 8429624 kB' 'KReclaimable: 238172 kB' 'Slab: 913808 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675636 kB' 'KernelStack: 21744 kB' 'PageTables: 7476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10254720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the scan "continue"s through every key of this snapshot, MemTotal through HugePages_Rsvd, without matching HugePages_Surp ...]
00:04:28.802 17:48:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.802 17:48:21 -- setup/common.sh@33 -- # echo 0 00:04:28.802 17:48:21 -- setup/common.sh@33 -- # return 0
00:04:28.802 17:48:21 -- setup/hugepages.sh@99 -- # surp=0
00:04:28.802 17:48:21 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:28.802 17:48:21 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
[... xtrace elided: identical local/mapfile/IFS preamble ...]
00:04:28.803 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41597444 kB' 'MemAvailable: 45324596 kB' 'Buffers: 8940 kB' 'Cached: 12525840 kB' 'SwapCached: 0 kB' 'Active: 9406212 kB' 'Inactive: 3688312 kB' 'Active(anon): 8989368 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563020 kB' 'Mapped: 151308 kB' 'Shmem: 8429624 kB' 'KReclaimable: 238172 kB' 'Slab: 913756 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675584 kB' 'KernelStack: 21728 kB' 'PageTables: 7408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10254736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the scan "continue"s through every key of this snapshot until HugePages_Rsvd matches ...]
00:04:28.804 17:48:21 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.804 17:48:21 -- setup/common.sh@33 -- # echo 0 00:04:28.804 17:48:21 -- setup/common.sh@33 -- # return 0
00:04:28.804 17:48:21 -- setup/hugepages.sh@100 -- # resv=0 00:04:28.804 17:48:21 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 nr_hugepages=1025 00:04:28.804 17:48:21 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 resv_hugepages=0 00:04:28.804 17:48:21 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 surplus_hugepages=0 00:04:28.804 17:48:21 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 anon_hugepages=0 00:04:28.804 17:48:21 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:28.804 17:48:21 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
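The check just traced reduces to two identities over the values read back: the pool the kernel reports must equal the requested pages plus surplus plus reserved, and with both of those at zero it must equal the request exactly. A minimal standalone sketch with this run's numbers (variable names follow the trace):

#!/usr/bin/env bash
# Values read back via get_meminfo in this run.
nr_hugepages=1025   # requested pool size
surp=0              # HugePages_Surp
resv=0              # HugePages_Rsvd
total=1025          # HugePages_Total

# The pool must account for request + surplus + reserved pages...
(( total == nr_hugepages + surp + resv )) || { echo 'pool mismatch'; exit 1; }
# ...and with no surplus/reserved pages, exactly the request.
(( total == nr_hugepages )) || { echo 'unexpected surplus/reserved'; exit 1; }
echo "hugepage accounting consistent: total=$total"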
00:04:28.804 17:48:21 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:28.804 17:48:21 -- setup/common.sh@17 -- # local get=HugePages_Total
[... xtrace elided: identical local/mapfile/IFS preamble ...]
00:04:28.804 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41597444 kB' 'MemAvailable: 45324596 kB' 'Buffers: 8940 kB' 'Cached: 12525844 kB' 'SwapCached: 0 kB' 'Active: 9406376 kB' 'Inactive: 3688312 kB' 'Active(anon): 8989532 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563180 kB' 'Mapped: 151308 kB' 'Shmem: 8429628 kB' 'KReclaimable: 238172 kB' 'Slab: 913756 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675584 kB' 'KernelStack: 21712 kB' 'PageTables: 7352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10254748 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the key-by-key scan of this snapshot continues past the end of this excerpt ...]
17:48:21 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # continue 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # continue 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # continue 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # continue 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # continue 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.805 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.805 17:48:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.805 17:48:21 -- setup/common.sh@33 -- # echo 1025 00:04:28.805 17:48:21 -- setup/common.sh@33 -- # return 0 00:04:28.805 17:48:21 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:28.805 17:48:21 -- setup/hugepages.sh@112 -- # get_nodes 00:04:28.806 17:48:21 -- setup/hugepages.sh@27 -- # local node 00:04:28.806 17:48:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:28.806 17:48:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:28.806 17:48:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:28.806 17:48:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:28.806 17:48:21 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:28.806 17:48:21 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:28.806 17:48:21 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:28.806 17:48:21 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:28.806 17:48:21 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:28.806 17:48:21 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:28.806 17:48:21 -- setup/common.sh@18 -- # local node=0 00:04:28.806 17:48:21 -- setup/common.sh@19 -- # local var val 00:04:28.806 17:48:21 -- setup/common.sh@20 -- # local mem_f mem 00:04:28.806 17:48:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.806 17:48:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:28.806 17:48:21 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:28.806 17:48:21 -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.806 17:48:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.806 17:48:21 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.806 17:48:21 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.806 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26220384 kB' 'MemUsed: 6414052 kB' 'SwapCached: 0 kB' 'Active: 
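The lookup that just returned 1025 is the workhorse of this whole section. A minimal standalone sketch of the pattern, for readers skimming the trace (assumes bash; the real setup/common.sh slurps the file with mapfile and strips any "Node N " prefix with a pattern substitution, which sed approximates here):

# minimal sketch of the get_meminfo pattern traced above
get_meminfo() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo var val _
    # per-node queries read that node's own meminfo, as the node0/node1 lookups below do
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    while IFS=': ' read -r var val _; do
        # non-matching fields fall through, exactly like the long runs of "continue" above
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed 's/^Node [0-9]* *//' "$mem_f")
    return 1
}

# the @110 check: the kernel-reported total must equal configured pages plus surplus plus reserved
nr_hugepages=1025 surp=0 resv=0   # illustrative values; the trace's check passed with a total of 1025
(( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )) && echo 'accounting OK'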
00:04:28.805 17:48:21 -- setup/hugepages.sh@112 -- # get_nodes
00:04:28.806 17:48:21 -- setup/hugepages.sh@27 -- # local node
00:04:28.806 17:48:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:28.806 17:48:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:28.806 17:48:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:28.806 17:48:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:28.806 17:48:21 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:28.806 17:48:21 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:28.806 17:48:21 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:28.806 17:48:21 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:28.806 17:48:21 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:28.806 17:48:21 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:28.806 17:48:21 -- setup/common.sh@18 -- # local node=0
00:04:28.806 17:48:21 -- setup/common.sh@19 -- # local var val
00:04:28.806 17:48:21 -- setup/common.sh@20 -- # local mem_f mem
00:04:28.806 17:48:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:28.806 17:48:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:28.806 17:48:21 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:28.806 17:48:21 -- setup/common.sh@28 -- # mapfile -t mem
00:04:28.806 17:48:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:28.806 17:48:21 -- setup/common.sh@31 -- # IFS=': '
00:04:28.806 17:48:21 -- setup/common.sh@31 -- # read -r var val _
00:04:28.806 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26220384 kB' 'MemUsed: 6414052 kB' 'SwapCached: 0 kB' 'Active: 3691036 kB' 'Inactive: 168948 kB' 'Active(anon): 3568652 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3681836 kB' 'Mapped: 59616 kB' 'AnonPages: 181256 kB' 'Shmem: 3390504 kB' 'KernelStack: 10648 kB' 'PageTables: 3380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153400 kB' 'Slab: 469980 kB' 'SReclaimable: 153400 kB' 'SUnreclaim: 316580 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: the same field-by-field scan runs over node0's meminfo until HugePages_Surp matches ...]
00:04:28.807 17:48:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:28.807 17:48:21 -- setup/common.sh@33 -- # echo 0
00:04:28.807 17:48:21 -- setup/common.sh@33 -- # return 0
00:04:28.807 17:48:21 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
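The node0 lookup above, and the node1 lookup that follows, pull HugePages_Surp out of each node's own meminfo file. The same per-node figures are also exposed as individual sysfs files, which can be easier to consume than parsing meminfo; an illustrative sketch (standard sysfs hugepage paths, written for the 2 MiB page size and two-node topology this job runs on):

# per-node 2 MiB hugepage counters, one sysfs file per value
for node in /sys/devices/system/node/node[0-9]*; do
    hp=$node/hugepages/hugepages-2048kB
    [[ -d $hp ]] || continue   # skip if this kernel lacks per-node hugepage sysfs entries
    echo "${node##*/}: total=$(cat "$hp"/nr_hugepages) free=$(cat "$hp"/free_hugepages) surplus=$(cat "$hp"/surplus_hugepages)"
done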
00:04:28.807 17:48:21 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:28.807 17:48:21 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:28.807 17:48:21 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:28.807 17:48:21 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:28.807 17:48:21 -- setup/common.sh@18 -- # local node=1
00:04:28.807 17:48:21 -- setup/common.sh@19 -- # local var val
00:04:28.807 17:48:21 -- setup/common.sh@20 -- # local mem_f mem
00:04:28.807 17:48:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:28.807 17:48:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:28.807 17:48:21 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:28.807 17:48:21 -- setup/common.sh@28 -- # mapfile -t mem
00:04:28.807 17:48:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:28.807 17:48:21 -- setup/common.sh@31 -- # IFS=': '
00:04:28.807 17:48:21 -- setup/common.sh@31 -- # read -r var val _
00:04:28.807 17:48:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 15377928 kB' 'MemUsed: 12271432 kB' 'SwapCached: 0 kB' 'Active: 5714868 kB' 'Inactive: 3519364 kB' 'Active(anon): 5420408 kB' 'Inactive(anon): 0 kB' 'Active(file): 294460 kB' 'Inactive(file): 3519364 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8852996 kB' 'Mapped: 91692 kB' 'AnonPages: 381372 kB' 'Shmem: 5039172 kB' 'KernelStack: 11064 kB' 'PageTables: 3968 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84772 kB' 'Slab: 443776 kB' 'SReclaimable: 84772 kB' 'SUnreclaim: 359004 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[... xtrace elided: the field-by-field scan runs over node1's meminfo until HugePages_Surp matches ...]
00:04:28.808 17:48:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:28.808 17:48:21 -- setup/common.sh@33 -- # echo 0
00:04:28.808 17:48:21 -- setup/common.sh@33 -- # return 0
00:04:28.808 17:48:21 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
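With both surplus values folded in, the test can compare what it computed against what the kernel placed. The @126-@130 block below keys two associative arrays by the counts themselves, so the check passes whichever node ended up holding the odd extra page; a condensed sketch with demo values (the explicit sort is added here because bash does not guarantee associative-array key order):

# order-insensitive comparison of per-node hugepage counts (bash 4+ associative arrays)
declare -A sorted_t sorted_s
nodes_test=(513 512)   # counts the test computed per node (demo values)
nodes_sys=(512 513)    # counts the kernel actually placed (demo values)
for node in "${!nodes_test[@]}"; do
    sorted_t[${nodes_test[node]}]=1   # keyed by the count itself: key set {512, 513}
    sorted_s[${nodes_sys[node]}]=1
done
# identical key sets mean the same set of counts, whichever node got the extra page
t=$(printf '%s\n' "${!sorted_t[@]}" | sort -n | xargs)
s=$(printf '%s\n' "${!sorted_s[@]}" | sort -n | xargs)
[[ $t == "$s" ]] && echo "distribution OK"   # both sort to "512 513", as the @130 test shows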
00:04:28.808 17:48:21 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:28.808 17:48:21 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:28.808 17:48:21 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:28.808 17:48:21 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:28.808 node0=512 expecting 513
00:04:28.808 17:48:21 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:28.808 17:48:21 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:28.808 17:48:21 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:28.808 17:48:21 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:28.808 node1=513 expecting 512
00:04:28.808 17:48:21 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:28.808
00:04:28.808 real 0m3.725s
00:04:28.808 user 0m1.400s
00:04:28.808 sys 0m2.397s
00:04:28.808 17:48:21 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:28.808 17:48:21 -- common/autotest_common.sh@10 -- # set +x
00:04:28.808 ************************************
00:04:28.808 END TEST odd_alloc
00:04:28.808 ************************************
00:04:29.069 17:48:21 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:29.069 17:48:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:29.069
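The custom_alloc test that starts next sizes its pools by dividing a requested kB total by the default hugepage size, which is why the trace arrives at 512 and then 1024 pages. A sketch of that arithmetic (values from this run; 2048 kB is the Hugepagesize the later meminfo dump reports):

# how get_test_nr_hugepages turns a pool size into a page count
default_hugepages=2048                 # kB per 2 MiB hugepage
for size in 1048576 2097152; do        # 1 GiB and 2 GiB requests, in kB
    (( size >= default_hugepages )) && echo "$(( size / default_hugepages )) pages"
done                                   # -> 512 pages, then 1024 pages

# the per-node plan is then handed to setup.sh through HUGENODE, exactly as traced at @187:
HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'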
17:48:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:29.069 17:48:21 -- common/autotest_common.sh@10 -- # set +x 00:04:29.069 ************************************ 00:04:29.069 START TEST custom_alloc 00:04:29.069 ************************************ 00:04:29.069 17:48:21 -- common/autotest_common.sh@1114 -- # custom_alloc 00:04:29.069 17:48:21 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:29.069 17:48:21 -- setup/hugepages.sh@169 -- # local node 00:04:29.069 17:48:21 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:29.069 17:48:21 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:29.069 17:48:21 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:29.069 17:48:21 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:29.069 17:48:21 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:29.069 17:48:21 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:29.069 17:48:21 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:29.069 17:48:21 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:29.069 17:48:21 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:29.069 17:48:21 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:29.069 17:48:21 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:29.069 17:48:21 -- setup/hugepages.sh@83 -- # : 256 00:04:29.069 17:48:21 -- setup/hugepages.sh@84 -- # : 1 00:04:29.069 17:48:21 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:29.069 17:48:21 -- setup/hugepages.sh@83 -- # : 0 00:04:29.069 17:48:21 -- setup/hugepages.sh@84 -- # : 0 00:04:29.069 17:48:21 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:29.069 17:48:21 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:29.069 17:48:21 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:29.069 17:48:21 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:29.069 17:48:21 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:29.069 17:48:21 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:29.069 17:48:21 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:29.069 17:48:21 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:29.069 17:48:21 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:29.069 17:48:21 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:29.069 17:48:21 -- 
setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:29.069 17:48:21 -- setup/hugepages.sh@78 -- # return 0 00:04:29.069 17:48:21 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:29.069 17:48:21 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:29.069 17:48:21 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:29.069 17:48:21 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:29.069 17:48:21 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:29.069 17:48:21 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:29.069 17:48:21 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:29.069 17:48:21 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:29.069 17:48:21 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:29.069 17:48:21 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:29.069 17:48:21 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:29.069 17:48:21 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:29.069 17:48:21 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:29.069 17:48:21 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:29.069 17:48:21 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:29.069 17:48:21 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:29.069 17:48:21 -- setup/hugepages.sh@78 -- # return 0 00:04:29.069 17:48:21 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:29.069 17:48:21 -- setup/hugepages.sh@187 -- # setup output 00:04:29.069 17:48:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:29.069 17:48:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:32.362 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:32.362 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:32.363 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:32.627 17:48:25 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:32.627 17:48:25 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:32.627 17:48:25 -- setup/hugepages.sh@89 -- # local 
node
00:04:32.627 17:48:25 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:32.627 17:48:25 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:32.627 17:48:25 -- setup/hugepages.sh@92 -- # local surp
00:04:32.627 17:48:25 -- setup/hugepages.sh@93 -- # local resv
00:04:32.627 17:48:25 -- setup/hugepages.sh@94 -- # local anon
00:04:32.628 17:48:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:32.628 17:48:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:32.628 17:48:25 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:32.628 17:48:25 -- setup/common.sh@18 -- # local node=
00:04:32.628 17:48:25 -- setup/common.sh@19 -- # local var val
00:04:32.628 17:48:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:32.628 17:48:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:32.628 17:48:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:32.628 17:48:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:32.628 17:48:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:32.628 17:48:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:32.628 17:48:25 -- setup/common.sh@31 -- # IFS=': '
00:04:32.628 17:48:25 -- setup/common.sh@31 -- # read -r var val _
00:04:32.628 17:48:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40553576 kB' 'MemAvailable: 44280728 kB' 'Buffers: 8940 kB' 'Cached: 12525976 kB' 'SwapCached: 0 kB' 'Active: 9412676 kB' 'Inactive: 3688312 kB' 'Active(anon): 8995832 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569796 kB' 'Mapped: 151868 kB' 'Shmem: 8429760 kB' 'KReclaimable: 238172 kB' 'Slab: 913312 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675140 kB' 'KernelStack: 21712 kB' 'PageTables: 7240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10261240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214324 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the field-by-field scan steps over /proc/meminfo until AnonHugePages matches ...]
00:04:32.628 17:48:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:32.628 17:48:25 -- setup/common.sh@33 -- # echo 0
00:04:32.628 17:48:25 -- setup/common.sh@33 -- # return 0
00:04:32.628 17:48:25 -- setup/hugepages.sh@97 -- # anon=0
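The anon=0 just recorded follows from the guard at @96 above: AnonHugePages only matters when transparent hugepages are not switched off, since that counter measures THP-backed anonymous memory. Roughly, as a sketch (the sysfs path is the standard THP control file; the kernel brackets the active mode, e.g. "always [madvise] never"):

# sketch of the THP guard: read AnonHugePages only if THP can actually hand pages out
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
anon=0
if [[ $thp != *"[never]"* ]]; then
    # AnonHugePages (kB) counts THP-backed anonymous memory; this run reads 0
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
fi
echo "anon=$anon"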
00:04:32.629 17:48:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:32.629 17:48:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:32.629 17:48:25 -- setup/common.sh@18 -- # local node=
00:04:32.629 17:48:25 -- setup/common.sh@19 -- # local var val
00:04:32.629 17:48:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:32.629 17:48:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:32.629 17:48:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:32.629 17:48:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:32.629 17:48:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:32.629 17:48:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:32.629 17:48:25 -- setup/common.sh@31 -- # IFS=': '
00:04:32.629 17:48:25 -- setup/common.sh@31 -- # read -r var val _
00:04:32.629 17:48:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40555784 kB' 'MemAvailable: 44282936 kB' 'Buffers: 8940 kB' 'Cached: 12525980 kB' 'SwapCached: 0 kB' 'Active: 9407356 kB' 'Inactive: 3688312 kB' 'Active(anon): 8990512 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563988 kB' 'Mapped: 151724 kB' 'Shmem: 8429764 kB' 'KReclaimable: 238172 kB' 'Slab: 913360 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675188 kB' 'KernelStack: 21696 kB' 'PageTables: 7252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10255268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the field-by-field scan is stepping through /proc/meminfo toward HugePages_Surp when this section of the log ends ...]
17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.630 17:48:25 -- setup/common.sh@33 -- # echo 0 00:04:32.630 17:48:25 -- setup/common.sh@33 -- # return 0 00:04:32.630 17:48:25 -- setup/hugepages.sh@99 -- # surp=0 00:04:32.630 17:48:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:32.630 17:48:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:32.630 17:48:25 -- setup/common.sh@18 -- # local node= 00:04:32.630 17:48:25 -- setup/common.sh@19 -- # local var val 00:04:32.630 17:48:25 -- setup/common.sh@20 -- # local mem_f mem 00:04:32.630 17:48:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:32.630 17:48:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:32.630 17:48:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:32.630 17:48:25 -- setup/common.sh@28 -- # mapfile -t mem 00:04:32.630 17:48:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:32.630 17:48:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40556232 kB' 'MemAvailable: 44283384 kB' 'Buffers: 8940 kB' 'Cached: 12525992 kB' 'SwapCached: 0 kB' 'Active: 9407704 kB' 'Inactive: 3688312 kB' 'Active(anon): 8990860 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 
'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564412 kB' 'Mapped: 151312 kB' 'Shmem: 8429776 kB' 'KReclaimable: 238172 kB' 'Slab: 913360 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675188 kB' 'KernelStack: 21744 kB' 'PageTables: 7404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10255652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.630 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.630 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.631 17:48:25 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.631 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.631 17:48:25 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.631 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.631 17:48:25 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:32.631 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.631 17:48:25 -- setup/common.sh@31 -- # read -r var 
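[editor's note: every invocation in this stretch of the log exercises the same helper, setup/common.sh's get_meminfo, which snapshots a meminfo file and scans it key by key. Below is a minimal bash sketch of that technique, reconstructed only from the visible xtrace; the real SPDK function may differ in details such as error handling, so treat it as an approximation, not the verbatim source.]

    shopt -s extglob

    get_meminfo() { # sketch reconstructed from the xtrace above
        local get=$1 node=$2
        local var val _
        local mem_f mem line
        mem_f=/proc/meminfo
        # with a node argument, read the per-NUMA-node file instead
        if [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # per-node files prefix each row with "Node N "; strip it (extglob pattern)
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val" && return 0
        done
        return 1
    }

[usage: get_meminfo HugePages_Surp prints 0 in this run; get_meminfo HugePages_Surp 0 switches to the node0 view, as the later invocations in this log show.]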
[xtrace condensed: the @31-@32 read/continue loop skips every /proc/meminfo key above that is not HugePages_Rsvd]
00:04:32.632 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:32.632 17:48:25 -- setup/common.sh@33 -- # echo 0
00:04:32.632 17:48:25 -- setup/common.sh@33 -- # return 0
00:04:32.632 17:48:25 -- setup/hugepages.sh@100 -- # resv=0
00:04:32.632 17:48:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:32.632 nr_hugepages=1536
17:48:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:32.632 resv_hugepages=0
17:48:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:32.632 surplus_hugepages=0
17:48:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:32.632 anon_hugepages=0
17:48:25 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:32.632 17:48:25 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:04:32.632 17:48:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:32.632 17:48:25 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:32.632 17:48:25 -- setup/common.sh@18 -- # local node=
00:04:32.632 17:48:25 -- setup/common.sh@19 -- # local var val
00:04:32.632 17:48:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:32.632 17:48:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:32.632 17:48:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:32.632 17:48:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:32.632 17:48:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:32.632 17:48:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:32.632 17:48:25 -- setup/common.sh@31 -- # IFS=': '
00:04:32.632 17:48:25 -- setup/common.sh@31 -- # read -r var val _
00:04:32.632 17:48:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40556232 kB' 'MemAvailable: 44283384 kB' 'Buffers: 8940 kB' 'Cached: 12526004 kB' 'SwapCached: 0 kB' 'Active: 9408008 kB' 'Inactive: 3688312 kB' 'Active(anon): 8991164 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564692 kB' 'Mapped: 151312 kB' 'Shmem: 8429788 kB' 'KReclaimable: 238172 kB' 'Slab: 913360 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675188 kB' 'KernelStack: 21760 kB' 'PageTables: 7436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10257544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
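[editor's note: the setup/hugepages.sh lines above (@99-@110) implement a consistency check: the kernel's global pool must equal the requested page count plus surplus and reserved pages. A hedged illustration, reusing the get_meminfo sketch from the earlier note; nr_hugepages is the test's requested total, 1536 in this run:]

    nr_hugepages=1536
    surp=$(get_meminfo HugePages_Surp)    # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
    total=$(get_meminfo HugePages_Total)  # 1536 in this run
    # the traced test only proceeds when the accounting balances
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2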
[xtrace condensed: same read loop; every key fails the HugePages_Total match until the HugePages_Total row]
00:04:32.633 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:32.633 17:48:25 -- setup/common.sh@33 -- # echo 1536
00:04:32.633 17:48:25 -- setup/common.sh@33 -- # return 0
00:04:32.633 17:48:25 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:32.633 17:48:25 -- setup/hugepages.sh@112 -- # get_nodes
00:04:32.633 17:48:25 -- setup/hugepages.sh@27 -- # local node
00:04:32.633 17:48:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:32.633 17:48:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:32.633 17:48:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:32.633 17:48:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:32.633 17:48:25 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:32.633 17:48:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:32.633 17:48:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:32.633 17:48:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:32.633 17:48:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:32.633 17:48:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:32.633 17:48:25 -- setup/common.sh@18 -- # local node=0
00:04:32.633 17:48:25 -- setup/common.sh@19 -- # local var val
00:04:32.633 17:48:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:32.633 17:48:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:32.633 17:48:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:32.633 17:48:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:32.633 17:48:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:32.633 17:48:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:32.633 17:48:25 -- setup/common.sh@31 -- # IFS=': '
00:04:32.633 17:48:25 -- setup/common.sh@31 -- # read -r var val _
00:04:32.633 17:48:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26189448 kB' 'MemUsed: 6444988 kB' 'SwapCached: 0 kB' 'Active: 3691880 kB' 'Inactive: 168948 kB' 'Active(anon): 3569496 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3681920 kB' 'Mapped: 59620 kB' 'AnonPages: 182156 kB' 'Shmem: 3390588 kB' 'KernelStack: 10712 kB' 'PageTables: 3536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153400 kB' 'Slab: 469840 kB' 'SReclaimable: 153400 kB' 'SUnreclaim: 316440 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
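[editor's note: get_nodes above walks /sys/devices/system/node/node+([0-9]) and records an expected hugepage count per NUMA node (512 for node0, 1024 for node1 here; 512 + 1024 matches the global HugePages_Total of 1536). The trace assigns those counts directly, and where they come from is not visible in this excerpt, so the sysfs read in the sketch below is an assumption rather than the traced code:]

    shopt -s extglob nullglob
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # assumed source: per-node 2 MiB hugepage pool size exposed by the kernel
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}
    (( no_nodes > 0 )) || exit 1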
[xtrace condensed: same read loop over the node0 snapshot; every key fails the HugePages_Surp match until the HugePages_Surp row]
00:04:32.634 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:32.634 17:48:25 -- setup/common.sh@33 -- # echo 0
00:04:32.634 17:48:25 -- setup/common.sh@33 -- # return 0
00:04:32.634 17:48:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
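[editor's note: the per-node invocations switch mem_f to /sys/devices/system/node/nodeN/meminfo, whose rows carry a "Node N " prefix that the traced mem=("${mem[@]#Node +([0-9]) }") expansion strips before splitting on ': '. A standalone illustration with a hypothetical line:]

    shopt -s extglob
    line='Node 0 HugePages_Total: 512'
    # shortest-prefix removal with an extglob pattern for the node index
    echo "${line#Node +([0-9]) }"   # -> HugePages_Total: 512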
00:04:32.634 17:48:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:32.634 17:48:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:32.634 17:48:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:32.634 17:48:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:32.634 17:48:25 -- setup/common.sh@18 -- # local node=1
00:04:32.634 17:48:25 -- setup/common.sh@19 -- # local var val
00:04:32.634 17:48:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:32.634 17:48:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:32.634 17:48:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:32.634 17:48:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:32.634 17:48:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:32.634 17:48:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:32.634 17:48:25 -- setup/common.sh@31 -- # IFS=': '
00:04:32.634 17:48:25 -- setup/common.sh@31 -- # read -r var val _
00:04:32.635 17:48:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 14365528 kB' 'MemUsed: 13283832 kB' 'SwapCached: 0 kB' 'Active: 5715968 kB' 'Inactive: 3519364 kB' 'Active(anon): 5421508 kB' 'Inactive(anon): 0 kB' 'Active(file): 294460 kB' 'Inactive(file): 3519364 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8853044 kB' 'Mapped: 91692 kB' 'AnonPages: 382404 kB' 'Shmem: 5039220 kB' 'KernelStack: 11048 kB' 'PageTables: 3912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84772 kB' 'Slab: 443512 kB' 'SReclaimable: 84772 kB' 'SUnreclaim: 358740 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: same read loop over the node1 snapshot; the visible keys MemTotal through HugePages_Total all fail the HugePages_Surp match and continue]
00:04:32.635 17:48:25 --
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.635 17:48:25 -- setup/common.sh@32 -- # continue 00:04:32.635 17:48:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:32.635 17:48:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:32.635 17:48:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.635 17:48:25 -- setup/common.sh@33 -- # echo 0 00:04:32.635 17:48:25 -- setup/common.sh@33 -- # return 0 00:04:32.635 17:48:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:32.635 17:48:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:32.635 17:48:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:32.635 17:48:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:32.635 17:48:25 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:32.635 node0=512 expecting 512 00:04:32.635 17:48:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:32.636 17:48:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:32.636 17:48:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:32.636 17:48:25 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:32.636 node1=1024 expecting 1024 00:04:32.636 17:48:25 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:32.636 00:04:32.636 real 0m3.773s 00:04:32.636 user 0m1.481s 00:04:32.636 sys 0m2.363s 00:04:32.636 17:48:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:32.636 17:48:25 -- common/autotest_common.sh@10 -- # set +x 00:04:32.636 ************************************ 00:04:32.636 END TEST custom_alloc 00:04:32.636 ************************************ 00:04:32.896 17:48:25 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:32.896 17:48:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:32.896 17:48:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:32.896 17:48:25 -- common/autotest_common.sh@10 -- # set +x 00:04:32.896 ************************************ 00:04:32.896 START TEST no_shrink_alloc 00:04:32.896 ************************************ 00:04:32.896 17:48:25 -- common/autotest_common.sh@1114 -- # no_shrink_alloc 00:04:32.896 17:48:25 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:32.896 17:48:25 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:32.896 17:48:25 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:32.896 17:48:25 -- setup/hugepages.sh@51 -- # shift 00:04:32.896 17:48:25 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:32.896 17:48:25 -- setup/hugepages.sh@52 -- # local node_ids 00:04:32.896 17:48:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:32.896 17:48:25 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:32.896 17:48:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:32.896 17:48:25 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:32.896 17:48:25 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:32.896 17:48:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:32.896 17:48:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:32.896 17:48:25 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:32.896 17:48:25 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:32.896 17:48:25 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:32.896 17:48:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:32.896 17:48:25 -- 
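The custom_alloc verification above leans entirely on setup/common.sh's get_meminfo: it reads /proc/meminfo, or /sys/devices/system/node/nodeN/meminfo when a node id is passed (stripping the "Node <N>" prefix those files carry), then walks the keys until the requested field matches; that is what the long [[ ... ]] / continue xtrace runs are. A minimal standalone sketch of the same lookup (illustrative only; it mirrors the trace, not the actual setup/common.sh source):

    # get_meminfo FIELD [NODE] -- print the value of one meminfo field.
    get_meminfo() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # Per-node counters live in the node's own meminfo file when present.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        # Strip the "Node <N> " prefix of per-node files, then scan key by
        # key, i.e. the [[ ... ]] / continue loop seen in the xtrace above.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
        return 1
    }

    get_meminfo HugePages_Surp 1   # prints 0 here, per the node1 snapshot above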
00:04:32.896 17:48:25 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:32.896 17:48:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:32.896 17:48:25 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:32.896 17:48:25 -- common/autotest_common.sh@10 -- # set +x
00:04:32.896 ************************************
00:04:32.896 START TEST no_shrink_alloc
00:04:32.896 ************************************
00:04:32.896 17:48:25 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:04:32.896 17:48:25 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:32.896 17:48:25 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:32.896 17:48:25 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:32.896 17:48:25 -- setup/hugepages.sh@51 -- # shift
00:04:32.896 17:48:25 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:32.896 17:48:25 -- setup/hugepages.sh@52 -- # local node_ids
00:04:32.896 17:48:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:32.896 17:48:25 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:32.896 17:48:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:32.896 17:48:25 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:32.896 17:48:25 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:32.896 17:48:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:32.896 17:48:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:32.896 17:48:25 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:32.896 17:48:25 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:32.896 17:48:25 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:32.896 17:48:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:32.896 17:48:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:32.896 17:48:25 -- setup/hugepages.sh@73 -- # return 0
00:04:32.896 17:48:25 -- setup/hugepages.sh@198 -- # setup output
00:04:32.896 17:48:25 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:32.896 17:48:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:36.189 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:36.189 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:36.189 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:36.189 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:36.190 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:36.453 17:48:29 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:36.453 17:48:29 -- setup/hugepages.sh@89 -- # local node
00:04:36.453 17:48:29 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:36.453 17:48:29 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:36.453 17:48:29 -- setup/hugepages.sh@92 -- # local surp
00:04:36.453 17:48:29 -- setup/hugepages.sh@93 -- # local resv
00:04:36.453 17:48:29 -- setup/hugepages.sh@94 -- # local anon
00:04:36.453 17:48:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:36.453 17:48:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
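The hugepages.sh@96 test just above compares /sys/kernel/mm/transparent_hugepage/enabled (here "always [madvise] never") against the *\[\n\e\v\e\r\]* pattern: AnonHugePages only needs to be sampled when transparent hugepages are not globally disabled, because that field counts THP-backed anonymous memory that would otherwise distort the pool accounting. Roughly, as a sketch (standard sysfs path; the variable names are ours):

    thp_mode=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
    if [[ $thp_mode != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)  # kB of anon memory in transparent hugepages
    else
        anon=0                             # THP disabled: nothing can be THP-backed
    fi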
00:04:36.453 17:48:29 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:36.453 17:48:29 -- setup/common.sh@18 -- # local node=
00:04:36.453 17:48:29 -- setup/common.sh@19 -- # local var val
00:04:36.453 17:48:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:36.453 17:48:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:36.453 17:48:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:36.453 17:48:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:36.453 17:48:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:36.453 17:48:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:36.453 17:48:29 -- setup/common.sh@31 -- # IFS=': '
00:04:36.453 17:48:29 -- setup/common.sh@31 -- # read -r var val _
00:04:36.453 17:48:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41590956 kB' 'MemAvailable: 45318108 kB' 'Buffers: 8940 kB' 'Cached: 12526116 kB' 'SwapCached: 0 kB' 'Active: 9409588 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992744 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566128 kB' 'Mapped: 151316 kB' 'Shmem: 8429900 kB' 'KReclaimable: 238172 kB' 'Slab: 913472 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675300 kB' 'KernelStack: 21936 kB' 'PageTables: 7336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10260828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214528 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:36.453 17:48:29 -- [... setup/common.sh@31/@32 loop: every key from MemTotal through HardwareCorrupted fails the AnonHugePages match and is skipped with "continue" ...]
00:04:36.454 17:48:29 -- setup/common.sh@33 -- # echo 0
00:04:36.454 17:48:29 -- setup/common.sh@33 -- # return 0
00:04:36.454 17:48:29 -- setup/hugepages.sh@97 -- # anon=0
00:04:36.454 17:48:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
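HugePages_Surp, fetched here, counts pages the kernel has allocated beyond nr_hugepages out of the overcommit pool (governed by /proc/sys/vm/nr_overcommit_hugepages); the test expects 0 since it never enables overcommit. A one-off equivalent of the lookup (standard /proc/meminfo field):

    awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo   # a page count, not kB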
00:04:36.454 17:48:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:36.454 17:48:29 -- setup/common.sh@18 -- # local node=
00:04:36.454 17:48:29 -- setup/common.sh@19 -- # local var val
00:04:36.454 17:48:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:36.454 17:48:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:36.454 17:48:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:36.454 17:48:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:36.454 17:48:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:36.454 17:48:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:36.454 17:48:29 -- setup/common.sh@31 -- # IFS=': '
00:04:36.454 17:48:29 -- setup/common.sh@31 -- # read -r var val _
00:04:36.454 17:48:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41590468 kB' 'MemAvailable: 45317620 kB' 'Buffers: 8940 kB' 'Cached: 12526120 kB' 'SwapCached: 0 kB' 'Active: 9409508 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992664 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566008 kB' 'Mapped: 151316 kB' 'Shmem: 8429904 kB' 'KReclaimable: 238172 kB' 'Slab: 913516 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675344 kB' 'KernelStack: 22000 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10260840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214480 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:36.454 17:48:29 -- [... setup/common.sh@31/@32 loop: every key from MemTotal through HugePages_Free fails the HugePages_Surp match and is skipped with "continue" ...]
00:04:36.456 17:48:29 -- setup/common.sh@33 -- # echo 0
00:04:36.456 17:48:29 -- setup/common.sh@33 -- # return 0
00:04:36.456 17:48:29 -- setup/hugepages.sh@99 -- # surp=0
00:04:36.456 17:48:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
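HugePages_Rsvd, fetched next, is the complementary counter: pages already promised to existing hugetlbfs mappings but not yet faulted in, so HugePages_Free minus HugePages_Rsvd is what new mappings can still claim. All four pool counters can be pulled in one pass (standard /proc/meminfo fields; the shell variable names below are ours):

    read -r total free rsvd surp < <(awk '
        /^HugePages_Total:/ {t=$2} /^HugePages_Free:/ {f=$2}
        /^HugePages_Rsvd:/  {r=$2} /^HugePages_Surp:/ {s=$2}
        END {print t, f, r, s}' /proc/meminfo)
    echo "claimable by new mappings: $((free - rsvd)) pages"   # 1024 here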
00:04:36.456 17:48:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:36.456 17:48:29 -- setup/common.sh@18 -- # local node=
00:04:36.456 17:48:29 -- setup/common.sh@19 -- # local var val
00:04:36.456 17:48:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:36.456 17:48:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:36.456 17:48:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:36.456 17:48:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:36.456 17:48:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:36.456 17:48:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:36.456 17:48:29 -- setup/common.sh@31 -- # IFS=': '
00:04:36.456 17:48:29 -- setup/common.sh@31 -- # read -r var val _
00:04:36.456 17:48:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41591608 kB' 'MemAvailable: 45318760 kB' 'Buffers: 8940 kB' 'Cached: 12526124 kB' 'SwapCached: 0 kB' 'Active: 9409084 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992240 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 565580 kB' 'Mapped: 151300 kB' 'Shmem: 8429908 kB' 'KReclaimable: 238172 kB' 'Slab: 913584 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675412 kB' 'KernelStack: 21888 kB' 'PageTables: 7848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10259340 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214496 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:36.456 17:48:29 -- [... setup/common.sh@31/@32 loop: every key from MemTotal through HugePages_Free fails the HugePages_Rsvd match and is skipped with "continue" ...]
00:04:36.457 17:48:29 -- setup/common.sh@33 -- # echo 0
00:04:36.457 17:48:29 -- setup/common.sh@33 -- # return 0
00:04:36.457 17:48:29 -- setup/hugepages.sh@100 -- # resv=0
00:04:36.457 17:48:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:36.457 nr_hugepages=1024
00:04:36.457 17:48:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:36.457 resv_hugepages=0
00:04:36.457 17:48:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:36.457 surplus_hugepages=0
00:04:36.457 17:48:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:36.457 anon_hugepages=0
00:04:36.457 17:48:29 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:36.457 17:48:29 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
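The two arithmetic tests at hugepages.sh@107/@109 are the core invariant of verify_nr_hugepages: the pool the kernel reports must equal what the test configured plus any surplus and reserved pages, here 1024 == 1024 + 0 + 0, where the 1024 comes from the requested 2097152 kB divided by the 2048 kB Hugepagesize shown in every snapshot. The same check as a standalone sketch (get_meminfo as sketched earlier; the counters gathered as in the trace):

    nr_hugepages=1024                     # requested: 2097152 kB / 2048 kB per page
    surp=$(get_meminfo HugePages_Surp)    # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
    total=$(get_meminfo HugePages_Total)
    (( total == nr_hugepages + surp + resv )) || echo "hugepage pool mismatch: $total" >&2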
00:04:36.457 17:48:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:36.457 17:48:29 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:36.457 17:48:29 -- setup/common.sh@18 -- # local node=
00:04:36.457 17:48:29 -- setup/common.sh@19 -- # local var val
00:04:36.457 17:48:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:36.457 17:48:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:36.457 17:48:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:36.457 17:48:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:36.457 17:48:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:36.457 17:48:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:36.457 17:48:29 -- setup/common.sh@31 -- # IFS=': '
00:04:36.457 17:48:29 -- setup/common.sh@31 -- # read -r var val _
00:04:36.458 17:48:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41590916 kB' 'MemAvailable: 45318068 kB' 'Buffers: 8940 kB' 'Cached: 12526124 kB' 'SwapCached: 0 kB' 'Active: 9409656 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992812 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566152 kB' 'Mapped: 151332 kB' 'Shmem: 8429908 kB' 'KReclaimable: 238172 kB' 'Slab: 913584 kB' 'SReclaimable: 238172 kB' 'SUnreclaim: 675412 kB' 'KernelStack: 21840 kB' 'PageTables: 7824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10259352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214528 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:36.458 17:48:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:36.458 17:48:29 -- setup/common.sh@32 -- # continue
[... identical test/continue pairs for every remaining /proc/meminfo key through Unaccepted ...]
00:04:36.459 17:48:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:36.459 17:48:29 -- setup/common.sh@33 -- # echo 1024
00:04:36.459 17:48:29 -- setup/common.sh@33 -- # return 0
00:04:36.459 17:48:29 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:36.459 17:48:29 -- setup/hugepages.sh@112 -- # get_nodes
00:04:36.459 17:48:29 -- setup/hugepages.sh@27 -- # local node
00:04:36.459 17:48:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:36.459 17:48:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:36.459 17:48:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:36.459 17:48:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:36.459 17:48:29 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:36.459 17:48:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:36.459 17:48:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:36.459 17:48:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
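The checks at setup/hugepages.sh@107 and @110 above assert the pool's accounting identity: the kernel's HugePages_Total must equal the requested nr_hugepages plus any surplus and reserved pages (here 1024 == 1024 + 0 + 0). A minimal sketch of the same check read straight from /proc/meminfo, assuming awk is available; nr_hugepages is hard-coded for illustration:

#!/usr/bin/env bash
# Sketch of the HugePages_Total == nr_hugepages + surp + resv assertion.
nr_hugepages=1024
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
  echo "hugepage accounting consistent: $total == $nr_hugepages + $surp + $resv"
else
  echo "hugepage accounting mismatch: total=$total nr=$nr_hugepages surp=$surp resv=$resv" >&2
  exit 1
fi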
00:04:36.459 17:48:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:36.459 17:48:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:36.459 17:48:29 -- setup/common.sh@18 -- # local node=0
00:04:36.459 17:48:29 -- setup/common.sh@19 -- # local var val
00:04:36.459 17:48:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:36.459 17:48:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:36.459 17:48:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:36.459 17:48:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:36.459 17:48:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:36.459 17:48:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:36.459 17:48:29 -- setup/common.sh@31 -- # IFS=': '
00:04:36.459 17:48:29 -- setup/common.sh@31 -- # read -r var val _
00:04:36.459 17:48:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25123932 kB' 'MemUsed: 7510504 kB' 'SwapCached: 0 kB' 'Active: 3692540 kB' 'Inactive: 168948 kB' 'Active(anon): 3570156 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3682000 kB' 'Mapped: 59624 kB' 'AnonPages: 182680 kB' 'Shmem: 3390668 kB' 'KernelStack: 10728 kB' 'PageTables: 3640 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153400 kB' 'Slab: 469780 kB' 'SReclaimable: 153400 kB' 'SUnreclaim: 316380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:36.459 17:48:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:36.459 17:48:29 -- setup/common.sh@32 -- # continue
[... identical test/continue pairs for every remaining node0 meminfo key through HugePages_Free ...]
00:04:36.460 17:48:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:36.460 17:48:29 -- setup/common.sh@33 -- # echo 0
00:04:36.460 17:48:29 -- setup/common.sh@33 -- # return 0
00:04:36.460 17:48:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:36.460 17:48:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:36.460 17:48:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:36.460 17:48:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:36.460 17:48:29 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:36.460 node0=1024 expecting 1024
00:04:36.460 17:48:29 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:36.460 17:48:29 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:36.460 17:48:29 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:36.460 17:48:29 -- setup/hugepages.sh@202 -- # setup output
00:04:36.460 17:48:29 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:36.460 17:48:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:39.754 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:39.754 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:39.754 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:40.018 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:40.018 INFO: Requested 512 hugepages but 1024 already allocated on node0
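The get_nodes trace and the node-qualified get_meminfo call above show the per-node side: NUMA nodes are enumerated under /sys/devices/system/node, and each node's hugepage counters come from that node's own meminfo file, whose lines carry a "Node N" prefix. A sketch of that walk, assuming a standard sysfs layout; it is illustrative only, the traced script globs with the extglob pattern node+([0-9]) instead:

#!/usr/bin/env bash
# Sketch of the per-node hugepage walk behind get_nodes (illustrative).
shopt -s nullglob
for node_dir in /sys/devices/system/node/node[0-9]*; do
  node=${node_dir##*node}
  # Per-node meminfo lines look like "Node 0 HugePages_Total: 1024",
  # so the value is simply the last field.
  total=$(awk '/HugePages_Total:/ {print $NF}' "$node_dir/meminfo")
  free=$(awk '/HugePages_Free:/ {print $NF}' "$node_dir/meminfo")
  echo "node$node: HugePages_Total=$total HugePages_Free=$free"
done
# On the machine traced here this prints node0 with all 1024 pages, node1 with 0.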
00:04:40.018 17:48:32 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:40.018 17:48:32 -- setup/hugepages.sh@89 -- # local node
00:04:40.018 17:48:32 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:40.018 17:48:32 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:40.018 17:48:32 -- setup/hugepages.sh@92 -- # local surp
00:04:40.018 17:48:32 -- setup/hugepages.sh@93 -- # local resv
00:04:40.018 17:48:32 -- setup/hugepages.sh@94 -- # local anon
00:04:40.018 17:48:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:40.018 17:48:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:40.018 17:48:32 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:40.018 17:48:32 -- setup/common.sh@18 -- # local node=
00:04:40.018 17:48:32 -- setup/common.sh@19 -- # local var val
00:04:40.018 17:48:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:40.018 17:48:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:40.018 17:48:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:40.018 17:48:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:40.018 17:48:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:40.018 17:48:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:40.018 17:48:32 -- setup/common.sh@31 -- # IFS=': '
00:04:40.018 17:48:32 -- setup/common.sh@31 -- # read -r var val _
00:04:40.018 17:48:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41597088 kB' 'MemAvailable: 45324172 kB' 'Buffers: 8940 kB' 'Cached: 12526236 kB' 'SwapCached: 0 kB' 'Active: 9410168 kB' 'Inactive: 3688312 kB' 'Active(anon): 8993324 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566768 kB' 'Mapped: 151448 kB' 'Shmem: 8430020 kB' 'KReclaimable: 238036 kB' 'Slab: 912828 kB' 'SReclaimable: 238036 kB' 'SUnreclaim: 674792 kB' 'KernelStack: 21744 kB' 'PageTables: 7488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10256920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:40.018 17:48:32 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:40.018 17:48:32 -- setup/common.sh@32 -- # continue
[... identical test/continue pairs for every remaining /proc/meminfo key through HardwareCorrupted ...]
00:04:40.020 17:48:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:40.020 17:48:32 -- setup/common.sh@33 -- # echo 0
00:04:40.020 17:48:32 -- setup/common.sh@33 -- # return 0
00:04:40.020 17:48:32 -- setup/hugepages.sh@97 -- # anon=0
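The anon_hugepages value comes from the two-step gate traced at setup/hugepages.sh@96-97: first check /sys/kernel/mm/transparent_hugepage/enabled (this host reports "always [madvise] never", i.e. not "[never]"), then read AnonHugePages from /proc/meminfo. A sketch of that gate, illustrative only:

#!/usr/bin/env bash
# Sketch of the THP gate: count anonymous huge pages only when
# transparent hugepages are not disabled outright.
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *"[never]"* ]]; then
  # The host above reports "always [madvise] never", so this branch runs.
  anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
else
  anon=0
fi
echo "anon_hugepages=$anon"   # 0 kB in the run traced here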
00:04:40.020 17:48:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:40.020 17:48:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:40.020 17:48:32 -- setup/common.sh@18 -- # local node=
00:04:40.020 17:48:32 -- setup/common.sh@19 -- # local var val
00:04:40.020 17:48:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:40.020 17:48:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:40.020 17:48:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:40.020 17:48:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:40.020 17:48:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:40.020 17:48:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:40.020 17:48:32 -- setup/common.sh@31 -- # IFS=': '
00:04:40.020 17:48:32 -- setup/common.sh@31 -- # read -r var val _
00:04:40.020 17:48:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41597640 kB' 'MemAvailable: 45324716 kB' 'Buffers: 8940 kB' 'Cached: 12526240 kB' 'SwapCached: 0 kB' 'Active: 9409564 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992720 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566172 kB' 'Mapped: 151404 kB' 'Shmem: 8430024 kB' 'KReclaimable: 238020 kB' 'Slab: 912816 kB' 'SReclaimable: 238020 kB' 'SUnreclaim: 674796 kB' 'KernelStack: 21728 kB' 'PageTables: 7420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10256932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:40.020 17:48:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:40.020 17:48:32 -- setup/common.sh@32 -- # continue
[... identical test/continue pairs for every remaining /proc/meminfo key through HugePages_Rsvd ...]
00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:40.022 17:48:32 -- setup/common.sh@33 -- # echo 0
00:04:40.022 17:48:32 -- setup/common.sh@33 -- # return 0
00:04:40.022 17:48:32 -- setup/hugepages.sh@99 -- # surp=0
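The HugePages_Surp and HugePages_Rsvd counters queried from /proc/meminfo above (both 0 here) are also exported per pool size under sysfs, which offers a quick cross-check. This is illustrative only and not part of the traced script; the 2048kB pool path matches the Hugepagesize reported in the snapshots above:

#!/usr/bin/env bash
# Cross-check hugepage counters against the per-pool sysfs view.
pool=/sys/kernel/mm/hugepages/hugepages-2048kB
for f in nr_hugepages free_hugepages surplus_hugepages resv_hugepages; do
  printf '%-18s %s\n' "$f:" "$(cat "$pool/$f")"
done
# Expected on this host: nr=1024, free=1024, surplus=0, resv=0.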
-r var val _ 00:04:40.021 17:48:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.021 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.021 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.021 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.022 17:48:32 -- setup/common.sh@33 -- # echo 0 00:04:40.022 17:48:32 -- setup/common.sh@33 -- # return 0 00:04:40.022 17:48:32 -- setup/hugepages.sh@99 -- # surp=0 00:04:40.022 17:48:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:40.022 17:48:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:40.022 17:48:32 -- setup/common.sh@18 -- # local node= 00:04:40.022 17:48:32 -- setup/common.sh@19 -- # local var val 00:04:40.022 17:48:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:40.022 17:48:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:40.022 17:48:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:40.022 17:48:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:40.022 17:48:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:40.022 17:48:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41598756 kB' 'MemAvailable: 45325832 kB' 'Buffers: 8940 kB' 'Cached: 12526240 kB' 'SwapCached: 0 kB' 'Active: 9409592 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992748 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566172 kB' 'Mapped: 151328 kB' 'Shmem: 8430024 kB' 'KReclaimable: 238020 kB' 'Slab: 912776 kB' 'SReclaimable: 238020 kB' 'SUnreclaim: 674756 kB' 'KernelStack: 21744 kB' 'PageTables: 7468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10256948 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- 
setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.022 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.022 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # 
[[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 
00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.023 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.023 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:40.023 17:48:32 -- setup/common.sh@33 -- # echo 0 00:04:40.024 17:48:32 -- setup/common.sh@33 -- # return 0 00:04:40.024 17:48:32 -- setup/hugepages.sh@100 -- # resv=0 00:04:40.024 17:48:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:40.024 nr_hugepages=1024 00:04:40.024 17:48:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:40.024 resv_hugepages=0 00:04:40.024 17:48:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:40.024 surplus_hugepages=0 00:04:40.024 17:48:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:40.024 anon_hugepages=0 00:04:40.024 17:48:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:40.024 17:48:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:40.024 17:48:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:40.024 17:48:32 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:40.024 17:48:32 -- setup/common.sh@18 -- # local node= 00:04:40.024 17:48:32 -- setup/common.sh@19 -- # local var val 00:04:40.024 17:48:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:40.024 17:48:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:40.024 17:48:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:40.024 17:48:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:40.024 17:48:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:40.024 17:48:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:40.024 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.024 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:40.024 17:48:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41598932 kB' 'MemAvailable: 45326008 kB' 'Buffers: 8940 kB' 'Cached: 12526244 kB' 'SwapCached: 0 kB' 'Active: 9409760 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992916 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566336 kB' 'Mapped: 151328 kB' 'Shmem: 8430028 kB' 'KReclaimable: 238020 kB' 'Slab: 912776 kB' 'SReclaimable: 238020 kB' 'SUnreclaim: 674756 kB' 'KernelStack: 21728 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10256960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB' 00:04:40.024 17:48:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.024 17:48:32 -- setup/common.sh@32 -- # continue 00:04:40.024 17:48:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:40.024 17:48:32 -- setup/common.sh@31 -- # read -r var val _ 
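The get_meminfo calls around this point (HugePages_Surp, HugePages_Rsvd, and the HugePages_Total query that follows) all trace the same pattern: pick /proc/meminfo or a node-scoped sysfs copy, strip any "Node <n> " prefix, and scan key/value pairs until the requested field matches. A minimal standalone sketch of that pattern plus the accounting invariant checked at hugepages.sh@107; this is a simplified rewrite, not the SPDK helper itself (the sed call stands in for the script's extglob mapfile trick):

  #!/usr/bin/env bash
  # get_meminfo <field> [node]: print one field from (per-node) meminfo.
  get_meminfo() {
      local get=$1 node=$2 var val _
      local mem_f=/proc/meminfo
      # Node-scoped queries read the sysfs copy instead, as traced above.
      [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
      # Per-node files prefix each line with "Node <n> "; drop it, then
      # scan "Key: value" pairs until the requested field is found.
      sed -E 's/^Node [0-9]+ //' "$mem_f" | while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && { echo "$val"; break; }
      done
  }

  total=$(get_meminfo HugePages_Total)   # 1024 in this run
  surp=$(get_meminfo HugePages_Surp)     # 0
  resv=$(get_meminfo HugePages_Rsvd)     # 0
  nr=$(cat /proc/sys/vm/nr_hugepages)
  # Mirrors the hugepages.sh@107/@110 checks: the pool must balance.
  (( total == nr + surp + resv )) && echo "hugepage accounting consistent"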
00:04:40.024 17:48:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:40.024 17:48:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:40.024 17:48:32 -- setup/common.sh@18 -- # local node=
00:04:40.024 17:48:32 -- setup/common.sh@19 -- # local var val
00:04:40.024 17:48:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:40.024 17:48:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:40.024 17:48:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:40.024 17:48:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:40.024 17:48:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:40.024 17:48:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:40.024 17:48:32 -- setup/common.sh@31 -- # IFS=': '
00:04:40.024 17:48:32 -- setup/common.sh@31 -- # read -r var val _
00:04:40.024 17:48:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41598932 kB' 'MemAvailable: 45326008 kB' 'Buffers: 8940 kB' 'Cached: 12526244 kB' 'SwapCached: 0 kB' 'Active: 9409760 kB' 'Inactive: 3688312 kB' 'Active(anon): 8992916 kB' 'Inactive(anon): 0 kB' 'Active(file): 416844 kB' 'Inactive(file): 3688312 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566336 kB' 'Mapped: 151328 kB' 'Shmem: 8430028 kB' 'KReclaimable: 238020 kB' 'Slab: 912776 kB' 'SReclaimable: 238020 kB' 'SUnreclaim: 674756 kB' 'KernelStack: 21728 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10256960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 603508 kB' 'DirectMap2M: 10616832 kB' 'DirectMap1G: 58720256 kB'
00:04:40.024-00:04:40.026 17:48:32 -- setup/common.sh@31-32 -- # [xtrace condensed: the HugePages_Total scan skips every field from MemTotal through Unaccepted, one IFS/read/continue triplet each]
00:04:40.026 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:40.026 17:48:32 -- setup/common.sh@33 -- # echo 1024
00:04:40.026 17:48:32 -- setup/common.sh@33 -- # return 0
00:04:40.026 17:48:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:40.026 17:48:32 -- setup/hugepages.sh@112 -- # get_nodes
00:04:40.026 17:48:32 -- setup/hugepages.sh@27 -- # local node
00:04:40.026 17:48:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:40.026 17:48:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:40.026 17:48:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:40.026 17:48:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:40.026 17:48:32 -- setup/hugepages.sh@32 -- # no_nodes=2
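get_nodes (hugepages.sh@27-32, traced above) discovers the NUMA topology by globbing the sysfs node directories and keying the nodes_sys array by node index. A minimal sketch under the same layout; it reads each node's current 2048 kB pool from sysfs directly, where the script carries the values over from earlier state:

  # Enumerate NUMA nodes and record each node's 2 MiB hugepage count.
  declare -A nodes_sys
  for node in /sys/devices/system/node/node[0-9]*; do
      n=${node##*node}   # "node0" -> "0"
      nodes_sys[$n]=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
  done
  no_nodes=${#nodes_sys[@]}
  echo "no_nodes=$no_nodes (node0=${nodes_sys[0]}, node1=${nodes_sys[1]})"  # 2 nodes here: 1024 and 0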
00:04:40.026 17:48:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:40.026 17:48:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:40.026 17:48:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:40.026 17:48:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:40.026 17:48:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:40.026 17:48:32 -- setup/common.sh@18 -- # local node=0
00:04:40.026 17:48:32 -- setup/common.sh@19 -- # local var val
00:04:40.026 17:48:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:40.026 17:48:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:40.026 17:48:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:40.026 17:48:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:40.026 17:48:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:40.026 17:48:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:40.026 17:48:32 -- setup/common.sh@31 -- # IFS=': '
00:04:40.026 17:48:32 -- setup/common.sh@31 -- # read -r var val _
00:04:40.026 17:48:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25131660 kB' 'MemUsed: 7502776 kB' 'SwapCached: 0 kB' 'Active: 3692264 kB' 'Inactive: 168948 kB' 'Active(anon): 3569880 kB' 'Inactive(anon): 0 kB' 'Active(file): 122384 kB' 'Inactive(file): 168948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3682124 kB' 'Mapped: 59636 kB' 'AnonPages: 182268 kB' 'Shmem: 3390792 kB' 'KernelStack: 10632 kB' 'PageTables: 3268 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 153272 kB' 'Slab: 469056 kB' 'SReclaimable: 153272 kB' 'SUnreclaim: 315784 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
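The node-scoped query just started parses node0's meminfo copy, but the same per-node hugepage figures are also exposed as single-value sysfs files, which avoids the field scan entirely. A short sketch; the file names are standard kernel sysfs attributes, not SPDK-specific helpers:

  # Per-node hugepage counters without parsing meminfo:
  hp=/sys/devices/system/node/node0/hugepages/hugepages-2048kB
  echo "node0 total:   $(cat "$hp/nr_hugepages")"       # 1024 in this run
  echo "node0 free:    $(cat "$hp/free_hugepages")"     # 1024
  echo "node0 surplus: $(cat "$hp/surplus_hugepages")"  # 0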
00:04:40.026-00:04:40.287 17:48:32 -- setup/common.sh@31-32 -- # [xtrace condensed: the node0 HugePages_Surp scan skips every field from MemTotal through HugePages_Free, one IFS/read/continue triplet each]
00:04:40.287 17:48:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:40.287 17:48:32 -- setup/common.sh@33 -- # echo 0
00:04:40.287 17:48:32 -- setup/common.sh@33 -- # return 0
00:04:40.287 17:48:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:40.287 17:48:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:40.287 17:48:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:40.287 17:48:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:40.287 17:48:32 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:40.287 node0=1024 expecting 1024
00:04:40.287 17:48:32 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:40.287
00:04:40.287 real 0m7.393s
00:04:40.287 user 0m2.786s
00:04:40.287 sys 0m4.745s
00:04:40.287 17:48:32 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:40.287 17:48:32 -- common/autotest_common.sh@10 -- # set +x
00:04:40.287 ************************************
00:04:40.287 END TEST no_shrink_alloc
00:04:40.287 ************************************
00:04:40.288 17:48:32 -- setup/hugepages.sh@217 -- # clear_hp
00:04:40.288 17:48:32 -- setup/hugepages.sh@37 -- # local node hp
00:04:40.288 17:48:32 -- setup/hugepages.sh@39-41 -- # [xtrace condensed: clear_hp iterates both nodes and every hugepage-size directory under /sys/devices/system/node/node$node/hugepages/, writing 0 into each pool; four echo 0 calls at setup/hugepages.sh@41 in all]
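clear_hp, condensed above, zeroes every pool on every node so the next suite starts from a clean slate, and the CLEAR_HUGE export that follows tells later setup.sh runs the pools were emptied deliberately. The loop amounts to the following (writes require root); a sketch of the traced behavior, not the script verbatim:

  # Zero every hugepage pool on every NUMA node, as clear_hp does.
  for node in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node"/hugepages/hugepages-*; do
          echo 0 > "$hp/nr_hugepages"
      done
  done
  export CLEAR_HUGE=yes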
00:04:40.288 17:48:32 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:40.288 17:48:32 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:40.288
00:04:40.288 real 0m28.510s
00:04:40.288 user 0m10.250s
00:04:40.288 sys 0m17.222s
00:04:40.288 17:48:32 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:40.288 17:48:32 -- common/autotest_common.sh@10 -- # set +x
00:04:40.288 ************************************
00:04:40.288 END TEST hugepages
00:04:40.288 ************************************
00:04:40.288 17:48:32 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:40.288 17:48:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:40.288 17:48:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:40.288 17:48:32 -- common/autotest_common.sh@10 -- # set +x
00:04:40.288 ************************************
00:04:40.288 START TEST driver
00:04:40.288 ************************************
00:04:40.288 17:48:33 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:40.288 * Looking for test storage...
00:04:40.288 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
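Every suite in this log is bracketed by the run_test wrapper seen at test-setup.sh@14: it prints the START/END banners, and the real/user/sys block comes from timing the payload. run_test's actual definition lives in autotest_common.sh; this is only a sketch of the observable shape, not the real helper:

  run_test() {
      local name=$1; shift
      echo '************************************'
      echo "START TEST $name"
      echo '************************************'
      time "$@"   # emits the real/user/sys block seen above
      echo '************************************'
      echo "END TEST $name"
      echo '************************************'
  }

  run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh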
00:04:40.288 17:48:33 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:40.288 17:48:33 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:40.288 17:48:33 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:40.547 17:48:33 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:40.547 17:48:33 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:40.548 17:48:33 -- scripts/common.sh@332-367 -- # [xtrace condensed: cmp_versions splits both versions on IFS=.-: (read -ra ver1/ver2, ver1_l=2, ver2_l=1, op='<') and walks the fields via decimal: ver1[0]=1 is less than ver2[0]=2, so it returns 0; lcov 1.15 sorts before 2]
00:04:40.548 17:48:33 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:40.548 17:48:33 -- common/autotest_common.sh@1703-1704 -- # [xtrace condensed: LCOV_OPTS and LCOV are assigned and exported with the coverage flags (--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1) plus --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh; the same option block is traced four times, once per assignment and export]
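The lcov gate above pipes lcov --version through awk '{print $NF}' and hands the result to lt, which defers to cmp_versions in scripts/common.sh: split both versions on '.', '-' and ':' and compare field by field. A self-contained sketch of that comparison; the real cmp_versions also normalizes non-numeric fields through decimal, omitted here:

  # lt A B: succeed when version A sorts strictly before version B.
  lt() {
      local IFS=.-:   # the separators cmp_versions splits on
      local -a ver1 ver2
      read -ra ver1 <<< "$1"
      read -ra ver2 <<< "$2"
      local v
      for (( v = 0; v < ${#ver1[@]} || v < ${#ver2[@]}; v++ )); do
          (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
          (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
      done
      return 1   # equal versions are not less-than
  }

  lt 1.15 2 && echo 'lcov predates 2.x: add the --rc branch/function coverage flags'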
00:04:45.826 ************************************ 00:04:45.826 17:48:38 -- common/autotest_common.sh@1114 -- # guess_driver 00:04:45.826 17:48:38 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:45.826 17:48:38 -- setup/driver.sh@47 -- # local fail=0 00:04:45.826 17:48:38 -- setup/driver.sh@49 -- # pick_driver 00:04:45.826 17:48:38 -- setup/driver.sh@36 -- # vfio 00:04:45.826 17:48:38 -- setup/driver.sh@21 -- # local iommu_grups 00:04:45.826 17:48:38 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:45.826 17:48:38 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:45.826 17:48:38 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:45.826 17:48:38 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:45.826 17:48:38 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:45.826 17:48:38 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:45.826 17:48:38 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:45.826 17:48:38 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:45.826 17:48:38 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:45.826 17:48:38 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:45.826 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:45.826 17:48:38 -- setup/driver.sh@30 -- # return 0 00:04:45.826 17:48:38 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:45.826 17:48:38 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:45.826 17:48:38 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:45.826 17:48:38 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:45.826 Looking for driver=vfio-pci 00:04:45.826 17:48:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:45.826 17:48:38 -- setup/driver.sh@45 -- # setup output config 00:04:45.826 17:48:38 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:45.826 17:48:38 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- 
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.121 17:48:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.121 17:48:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.121 17:48:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:50.503 17:48:43 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:50.503 17:48:43 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:50.503 17:48:43 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:50.762 17:48:43 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:50.762 17:48:43 -- setup/driver.sh@65 -- # setup reset 00:04:50.762 17:48:43 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:50.762 17:48:43 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:56.044 00:04:56.044 real 0m10.230s 00:04:56.044 user 0m2.795s 00:04:56.044 sys 0m5.208s 00:04:56.044 17:48:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:56.044 17:48:48 -- common/autotest_common.sh@10 
-- # set +x 00:04:56.044 ************************************ 00:04:56.044 END TEST guess_driver 00:04:56.044 ************************************ 00:04:56.044 00:04:56.044 real 0m15.480s 00:04:56.044 user 0m4.317s 00:04:56.044 sys 0m8.120s 00:04:56.044 17:48:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:56.044 17:48:48 -- common/autotest_common.sh@10 -- # set +x 00:04:56.044 ************************************ 00:04:56.044 END TEST driver 00:04:56.044 ************************************ 00:04:56.044 17:48:48 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:56.044 17:48:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:56.044 17:48:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:56.044 17:48:48 -- common/autotest_common.sh@10 -- # set +x 00:04:56.044 ************************************ 00:04:56.044 START TEST devices 00:04:56.044 ************************************ 00:04:56.044 17:48:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:56.044 * Looking for test storage... 00:04:56.044 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:56.044 17:48:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:56.044 17:48:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:56.044 17:48:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:56.044 17:48:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:56.044 17:48:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:56.044 17:48:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:56.044 17:48:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:56.044 17:48:48 -- scripts/common.sh@335 -- # IFS=.-: 00:04:56.044 17:48:48 -- scripts/common.sh@335 -- # read -ra ver1 00:04:56.044 17:48:48 -- scripts/common.sh@336 -- # IFS=.-: 00:04:56.044 17:48:48 -- scripts/common.sh@336 -- # read -ra ver2 00:04:56.044 17:48:48 -- scripts/common.sh@337 -- # local 'op=<' 00:04:56.044 17:48:48 -- scripts/common.sh@339 -- # ver1_l=2 00:04:56.044 17:48:48 -- scripts/common.sh@340 -- # ver2_l=1 00:04:56.044 17:48:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:56.044 17:48:48 -- scripts/common.sh@343 -- # case "$op" in 00:04:56.044 17:48:48 -- scripts/common.sh@344 -- # : 1 00:04:56.044 17:48:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:56.044 17:48:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:56.044 17:48:48 -- scripts/common.sh@364 -- # decimal 1 00:04:56.044 17:48:48 -- scripts/common.sh@352 -- # local d=1 00:04:56.044 17:48:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:56.044 17:48:48 -- scripts/common.sh@354 -- # echo 1 00:04:56.044 17:48:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:56.044 17:48:48 -- scripts/common.sh@365 -- # decimal 2 00:04:56.044 17:48:48 -- scripts/common.sh@352 -- # local d=2 00:04:56.044 17:48:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:56.044 17:48:48 -- scripts/common.sh@354 -- # echo 2 00:04:56.044 17:48:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:56.044 17:48:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:56.044 17:48:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:56.044 17:48:48 -- scripts/common.sh@367 -- # return 0 00:04:56.044 17:48:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:56.044 17:48:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:56.044 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:56.044 --rc genhtml_branch_coverage=1 00:04:56.044 --rc genhtml_function_coverage=1 00:04:56.044 --rc genhtml_legend=1 00:04:56.044 --rc geninfo_all_blocks=1 00:04:56.044 --rc geninfo_unexecuted_blocks=1 00:04:56.044 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:56.044 ' 00:04:56.044 17:48:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:56.044 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:56.044 --rc genhtml_branch_coverage=1 00:04:56.044 --rc genhtml_function_coverage=1 00:04:56.044 --rc genhtml_legend=1 00:04:56.044 --rc geninfo_all_blocks=1 00:04:56.044 --rc geninfo_unexecuted_blocks=1 00:04:56.044 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:56.044 ' 00:04:56.044 17:48:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:56.044 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:56.044 --rc genhtml_branch_coverage=1 00:04:56.044 --rc genhtml_function_coverage=1 00:04:56.044 --rc genhtml_legend=1 00:04:56.044 --rc geninfo_all_blocks=1 00:04:56.044 --rc geninfo_unexecuted_blocks=1 00:04:56.044 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:56.044 ' 00:04:56.044 17:48:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:56.045 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:56.045 --rc genhtml_branch_coverage=1 00:04:56.045 --rc genhtml_function_coverage=1 00:04:56.045 --rc genhtml_legend=1 00:04:56.045 --rc geninfo_all_blocks=1 00:04:56.045 --rc geninfo_unexecuted_blocks=1 00:04:56.045 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:56.045 ' 00:04:56.045 17:48:48 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:56.045 17:48:48 -- setup/devices.sh@192 -- # setup reset 00:04:56.045 17:48:48 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:56.045 17:48:48 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:00.242 17:48:52 -- setup/devices.sh@194 -- # get_zoned_devs 00:05:00.242 17:48:52 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:05:00.242 17:48:52 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:05:00.242 17:48:52 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:05:00.242 17:48:52 
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:05:00.242 17:48:52 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:05:00.242 17:48:52 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:05:00.242 17:48:52 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:00.242 17:48:52 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:05:00.242 17:48:52 -- setup/devices.sh@196 -- # blocks=() 00:05:00.242 17:48:52 -- setup/devices.sh@196 -- # declare -a blocks 00:05:00.242 17:48:52 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:00.242 17:48:52 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:00.242 17:48:52 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:00.242 17:48:52 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:00.242 17:48:52 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:00.242 17:48:52 -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:00.242 17:48:52 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:00.242 17:48:52 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:00.242 17:48:52 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:00.242 17:48:52 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:05:00.242 17:48:52 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:00.242 No valid GPT data, bailing 00:05:00.242 17:48:52 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:00.242 17:48:52 -- scripts/common.sh@393 -- # pt= 00:05:00.242 17:48:52 -- scripts/common.sh@394 -- # return 1 00:05:00.242 17:48:52 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:00.242 17:48:52 -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:00.242 17:48:52 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:00.242 17:48:52 -- setup/common.sh@80 -- # echo 1600321314816 00:05:00.242 17:48:52 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:00.242 17:48:52 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:00.242 17:48:52 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:00.242 17:48:52 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:00.242 17:48:52 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:00.242 17:48:52 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:00.242 17:48:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:00.242 17:48:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:00.242 17:48:52 -- common/autotest_common.sh@10 -- # set +x 00:05:00.242 ************************************ 00:05:00.242 START TEST nvme_mount 00:05:00.242 ************************************ 00:05:00.242 17:48:52 -- common/autotest_common.sh@1114 -- # nvme_mount 00:05:00.242 17:48:52 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:00.242 17:48:52 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:00.242 17:48:52 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:00.242 17:48:52 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:00.242 17:48:52 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:00.242 17:48:52 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:00.242 17:48:52 -- setup/common.sh@40 -- # local part_no=1 00:05:00.242 17:48:52 -- setup/common.sh@41 -- # 
local size=1073741824 00:05:00.242 17:48:52 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:00.242 17:48:52 -- setup/common.sh@44 -- # parts=() 00:05:00.242 17:48:52 -- setup/common.sh@44 -- # local parts 00:05:00.242 17:48:52 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:00.242 17:48:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.242 17:48:52 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:00.242 17:48:52 -- setup/common.sh@46 -- # (( part++ )) 00:05:00.242 17:48:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.242 17:48:52 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:00.242 17:48:52 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:00.242 17:48:52 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:01.182 Creating new GPT entries in memory. 00:05:01.182 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:01.182 other utilities. 00:05:01.182 17:48:53 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:01.182 17:48:53 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.182 17:48:53 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:01.182 17:48:53 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:01.182 17:48:53 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:02.121 Creating new GPT entries in memory. 00:05:02.121 The operation has completed successfully. 00:05:02.121 17:48:54 -- setup/common.sh@57 -- # (( part++ )) 00:05:02.121 17:48:54 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:02.121 17:48:54 -- setup/common.sh@62 -- # wait 593656 00:05:02.121 17:48:54 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.121 17:48:54 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:02.121 17:48:54 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.121 17:48:54 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:02.121 17:48:54 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:02.121 17:48:54 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.121 17:48:54 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:02.121 17:48:54 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:02.121 17:48:54 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:02.121 17:48:54 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.121 17:48:54 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:02.121 17:48:54 -- setup/devices.sh@53 -- # local found=0 00:05:02.121 17:48:54 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:02.121 17:48:54 -- setup/devices.sh@56 -- # : 00:05:02.121 17:48:54 -- setup/devices.sh@59 -- # local pci status 00:05:02.121 17:48:54 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.121 17:48:54 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:02.121 17:48:54 -- setup/devices.sh@47 -- # setup output config 00:05:02.121 17:48:54 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:02.121 17:48:54 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:05.415 17:48:58 -- setup/devices.sh@63 -- # found=1 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.415 17:48:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.415 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.675 17:48:58 -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:05:05.675 17:48:58 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:05.675 17:48:58 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.675 17:48:58 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:05.675 17:48:58 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:05.675 17:48:58 -- setup/devices.sh@110 -- # cleanup_nvme 00:05:05.675 17:48:58 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.675 17:48:58 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.675 17:48:58 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:05.675 17:48:58 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:05.675 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:05.675 17:48:58 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:05.675 17:48:58 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:05.934 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:05.934 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:05.934 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:05.934 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:05.934 17:48:58 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:05.934 17:48:58 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:05.934 17:48:58 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.934 17:48:58 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:05.934 17:48:58 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:05.934 17:48:58 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.934 17:48:58 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:05.934 17:48:58 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:05.934 17:48:58 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:05.934 17:48:58 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.934 17:48:58 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:05.934 17:48:58 -- setup/devices.sh@53 -- # local found=0 00:05:05.934 17:48:58 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:05.934 17:48:58 -- setup/devices.sh@56 -- # : 00:05:05.934 17:48:58 -- setup/devices.sh@59 -- # local pci status 00:05:05.934 17:48:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.934 17:48:58 -- setup/devices.sh@47 -- # 
PCI_ALLOWED=0000:d8:00.0 00:05:05.934 17:48:58 -- setup/devices.sh@47 -- # setup output config 00:05:05.934 17:48:58 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:05.934 17:48:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:09.227 17:49:01 -- setup/devices.sh@63 -- # found=1 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.227 17:49:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:09.227 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.487 17:49:02 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:09.487 17:49:02 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:09.487 17:49:02 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.487 17:49:02 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:09.487 17:49:02 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.487 17:49:02 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.487 17:49:02 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:09.487 17:49:02 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:09.487 17:49:02 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:09.487 17:49:02 -- setup/devices.sh@50 -- # local mount_point= 00:05:09.487 17:49:02 -- setup/devices.sh@51 -- # local test_file= 00:05:09.487 17:49:02 -- setup/devices.sh@53 -- # local found=0 00:05:09.487 17:49:02 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:09.487 17:49:02 -- setup/devices.sh@59 -- # local pci status 00:05:09.487 17:49:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.487 17:49:02 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:09.487 17:49:02 -- setup/devices.sh@47 -- # setup output config 00:05:09.487 17:49:02 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:09.487 17:49:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:12.781 17:49:05 -- setup/devices.sh@63 -- # found=1 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 
17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.781 17:49:05 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.781 17:49:05 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.041 17:49:05 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:13.041 17:49:05 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:13.041 17:49:05 -- setup/devices.sh@68 -- # return 0 00:05:13.041 17:49:05 -- setup/devices.sh@128 -- # cleanup_nvme 00:05:13.041 17:49:05 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.041 17:49:05 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:13.041 17:49:05 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:13.041 17:49:05 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:13.041 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:13.041 00:05:13.041 real 0m13.042s 00:05:13.041 user 0m3.931s 00:05:13.041 sys 0m7.107s 00:05:13.041 17:49:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:13.041 17:49:05 -- common/autotest_common.sh@10 -- # set +x 00:05:13.041 ************************************ 00:05:13.042 END TEST nvme_mount 00:05:13.042 ************************************ 00:05:13.042 17:49:05 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:13.042 17:49:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:13.042 17:49:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:13.042 17:49:05 -- common/autotest_common.sh@10 -- # set +x 00:05:13.042 ************************************ 00:05:13.042 START TEST dm_mount 00:05:13.042 ************************************ 00:05:13.042 17:49:05 -- common/autotest_common.sh@1114 -- # dm_mount 00:05:13.042 17:49:05 -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:13.042 17:49:05 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:13.042 17:49:05 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:13.042 17:49:05 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:13.042 17:49:05 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:13.042 17:49:05 -- setup/common.sh@40 -- # local part_no=2 00:05:13.042 17:49:05 -- setup/common.sh@41 -- # local size=1073741824 00:05:13.042 17:49:05 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:13.042 17:49:05 -- setup/common.sh@44 -- # parts=() 00:05:13.042 17:49:05 -- setup/common.sh@44 -- # local parts 00:05:13.042 17:49:05 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:13.042 17:49:05 -- setup/common.sh@46 -- # (( part <= part_no )) 
00:05:13.042 17:49:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:13.042 17:49:05 -- setup/common.sh@46 -- # (( part++ )) 00:05:13.042 17:49:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:13.042 17:49:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:13.042 17:49:05 -- setup/common.sh@46 -- # (( part++ )) 00:05:13.042 17:49:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:13.042 17:49:05 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:13.042 17:49:05 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:13.042 17:49:05 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:13.980 Creating new GPT entries in memory. 00:05:13.980 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:13.980 other utilities. 00:05:13.980 17:49:06 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:13.980 17:49:06 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:13.980 17:49:06 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:13.980 17:49:06 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:13.980 17:49:06 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:15.361 Creating new GPT entries in memory. 00:05:15.361 The operation has completed successfully. 00:05:15.361 17:49:07 -- setup/common.sh@57 -- # (( part++ )) 00:05:15.361 17:49:07 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:15.361 17:49:07 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:15.361 17:49:07 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:15.361 17:49:07 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:16.301 The operation has completed successfully. 
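The two sgdisk calls above follow directly from the arithmetic traced in setup/common.sh: a 1 GiB size is converted to 512-byte sectors, the first partition starts at sector 2048, and each later partition begins one sector past the previous end. A minimal standalone sketch of that sequence, with the device path, size, and partition count taken from this log (the sync_dev_uevents.sh udev helper is omitted, so treat this as an illustration distilled from the trace, not the SPDK helper itself):

#!/usr/bin/env bash
# Sketch of the partitioning loop traced above (setup/common.sh).
# Assumes $disk is a scratch NVMe device whose contents may be destroyed.
set -euo pipefail

disk=/dev/nvme0n1              # device under test in this log
part_no=2                      # dm_mount creates two partitions
size=$((1073741824 / 512))     # 1 GiB in 512-byte sectors -> 2097152

sgdisk "$disk" --zap-all       # wipe any existing GPT/MBR structures

part_start=0
part_end=0
for (( part = 1; part <= part_no; part++ )); do
    # First partition starts at sector 2048; later ones follow the previous end.
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    # flock serializes sgdisk invocations against the same disk node.
    flock "$disk" sgdisk "$disk" --new="$part:$part_start:$part_end"
done

With size = 2097152 sectors this yields exactly the ranges seen in the trace: 2048:2099199 for the first partition and 2099200:4196351 for the second.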
00:05:16.301 17:49:08 -- setup/common.sh@57 -- # (( part++ )) 00:05:16.301 17:49:08 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:16.301 17:49:08 -- setup/common.sh@62 -- # wait 598397 00:05:16.301 17:49:08 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:16.301 17:49:08 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:16.301 17:49:08 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:16.301 17:49:08 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:16.301 17:49:08 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:16.301 17:49:08 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:16.301 17:49:08 -- setup/devices.sh@161 -- # break 00:05:16.301 17:49:08 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:16.301 17:49:08 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:16.301 17:49:08 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:16.301 17:49:08 -- setup/devices.sh@166 -- # dm=dm-0 00:05:16.301 17:49:08 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:16.301 17:49:08 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:16.301 17:49:08 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:16.301 17:49:08 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:16.301 17:49:08 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:16.301 17:49:08 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:16.301 17:49:08 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:16.301 17:49:09 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:16.301 17:49:09 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:16.301 17:49:09 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:16.301 17:49:09 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:16.301 17:49:09 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:16.301 17:49:09 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:16.301 17:49:09 -- setup/devices.sh@53 -- # local found=0 00:05:16.301 17:49:09 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:16.302 17:49:09 -- setup/devices.sh@56 -- # : 00:05:16.302 17:49:09 -- setup/devices.sh@59 -- # local pci status 00:05:16.302 17:49:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.302 17:49:09 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:16.302 17:49:09 -- setup/devices.sh@47 -- # setup output config 00:05:16.302 17:49:09 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:16.302 17:49:09 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:19.594 17:49:12 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:19.594 17:49:12 -- setup/devices.sh@63 -- # found=1 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.594 17:49:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.594 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.854 17:49:12 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:19.854 17:49:12 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:19.854 17:49:12 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:19.854 17:49:12 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:19.854 17:49:12 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:19.854 17:49:12 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:19.854 17:49:12 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:19.854 17:49:12 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:19.854 17:49:12 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:19.854 17:49:12 -- setup/devices.sh@50 -- # local mount_point= 00:05:19.854 17:49:12 -- setup/devices.sh@51 -- # local test_file= 00:05:19.854 17:49:12 -- setup/devices.sh@53 -- # local found=0 00:05:19.854 17:49:12 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:19.854 17:49:12 -- setup/devices.sh@59 -- # local pci status 00:05:19.854 17:49:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.854 17:49:12 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:19.854 17:49:12 -- setup/devices.sh@47 -- # setup output config 00:05:19.854 17:49:12 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:19.854 17:49:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:23.147 17:49:15 -- setup/devices.sh@63 -- # found=1 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:23.147 17:49:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.147 17:49:15 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:23.147 17:49:15 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:23.147 17:49:15 -- setup/devices.sh@68 -- # return 0 00:05:23.147 17:49:15 -- setup/devices.sh@187 -- # cleanup_dm 00:05:23.147 17:49:15 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.147 17:49:15 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:23.147 17:49:15 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:23.407 17:49:16 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:23.407 17:49:16 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:23.407 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:23.407 17:49:16 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:23.407 17:49:16 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:23.407 00:05:23.407 real 0m10.231s 00:05:23.407 user 0m2.619s 00:05:23.407 sys 0m4.728s 00:05:23.407 17:49:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:23.407 17:49:16 -- common/autotest_common.sh@10 -- # set +x 00:05:23.407 ************************************ 00:05:23.407 END TEST dm_mount 00:05:23.407 ************************************ 00:05:23.407 17:49:16 -- setup/devices.sh@1 -- # cleanup 00:05:23.407 17:49:16 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:23.407 17:49:16 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:23.407 17:49:16 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:23.407 17:49:16 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:23.407 17:49:16 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:23.407 17:49:16 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:23.667 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:23.667 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:23.667 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:23.667 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:23.667 17:49:16 -- setup/devices.sh@12 -- # cleanup_dm 00:05:23.667 17:49:16 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.667 17:49:16 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:05:23.667 17:49:16 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:23.667 17:49:16 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:23.667 17:49:16 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:23.667 17:49:16 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:23.667 00:05:23.667 real 0m27.849s 00:05:23.667 user 0m8.153s 00:05:23.667 sys 0m14.750s 00:05:23.667 17:49:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:23.667 17:49:16 -- common/autotest_common.sh@10 -- # set +x 00:05:23.667 ************************************ 00:05:23.667 END TEST devices 00:05:23.667 ************************************ 00:05:23.667 00:05:23.667 real 1m37.798s 00:05:23.667 user 0m31.162s 00:05:23.667 sys 0m55.862s 00:05:23.667 17:49:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:23.667 17:49:16 -- common/autotest_common.sh@10 -- # set +x 00:05:23.667 ************************************ 00:05:23.667 END TEST setup.sh 00:05:23.667 ************************************ 00:05:23.667 17:49:16 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:26.960 Hugepages 00:05:26.960 node hugesize free / total 00:05:26.960 node0 1048576kB 0 / 0 00:05:26.960 node0 2048kB 2048 / 2048 00:05:26.960 node1 1048576kB 0 / 0 00:05:26.960 node1 2048kB 0 / 0 00:05:26.960 00:05:26.960 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:26.960 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:26.960 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:27.220 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:27.220 17:49:19 -- spdk/autotest.sh@128 -- # uname -s 00:05:27.220 17:49:19 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:05:27.220 17:49:19 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:05:27.220 17:49:19 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:30.515 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.515 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.515 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.774 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:32.682 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:32.682 17:49:25 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:33.620 17:49:26 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:33.620 17:49:26 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:33.620 17:49:26 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:33.620 17:49:26 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:33.620 17:49:26 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:33.620 17:49:26 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:33.620 17:49:26 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:33.620 17:49:26 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:33.620 17:49:26 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:33.620 17:49:26 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:33.620 17:49:26 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:33.620 17:49:26 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:36.912 Waiting for block devices as requested 00:05:36.912 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:36.912 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:37.171 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:37.171 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:37.171 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:37.430 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:37.430 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:37.430 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:37.690 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:37.690 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:37.690 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:37.949 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:37.949 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:37.949 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:38.209 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:38.209 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:38.209 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:38.468 17:49:31 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:38.468 17:49:31 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:38.468 17:49:31 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:38.468 17:49:31 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:38.468 17:49:31 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:38.468 
17:49:31 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:38.468 17:49:31 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:38.468 17:49:31 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:38.468 17:49:31 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:38.468 17:49:31 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:38.468 17:49:31 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:38.468 17:49:31 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:38.468 17:49:31 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:38.468 17:49:31 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:38.468 17:49:31 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:38.468 17:49:31 -- common/autotest_common.sh@1552 -- # continue 00:05:38.468 17:49:31 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:38.468 17:49:31 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:38.468 17:49:31 -- common/autotest_common.sh@10 -- # set +x 00:05:38.468 17:49:31 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:38.468 17:49:31 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:38.468 17:49:31 -- common/autotest_common.sh@10 -- # set +x 00:05:38.468 17:49:31 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:42.664 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:42.664 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:44.041 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:44.041 17:49:36 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:44.041 17:49:36 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:44.041 17:49:36 -- common/autotest_common.sh@10 -- # set +x 00:05:44.041 17:49:36 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:44.041 17:49:36 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:44.041 17:49:36 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:44.041 17:49:36 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:44.041 17:49:36 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:44.041 17:49:36 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:05:44.041 17:49:36 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:44.041 17:49:36 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:44.041 17:49:36 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:44.041 17:49:36 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:44.041 17:49:36 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:44.041 17:49:36 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:44.041 17:49:36 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:44.041 17:49:36 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:44.041 17:49:36 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:44.041 17:49:36 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:44.042 17:49:36 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:44.042 17:49:36 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:44.042 17:49:36 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:44.042 17:49:36 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:44.042 17:49:36 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=608361 00:05:44.042 17:49:36 -- common/autotest_common.sh@1593 -- # waitforlisten 608361 00:05:44.042 17:49:36 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:44.042 17:49:36 -- common/autotest_common.sh@829 -- # '[' -z 608361 ']' 00:05:44.042 17:49:36 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:44.042 17:49:36 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:44.042 17:49:36 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:44.042 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:44.042 17:49:36 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:44.042 17:49:36 -- common/autotest_common.sh@10 -- # set +x 00:05:44.042 [2024-11-19 17:49:36.859201] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
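[Note] The pre-cleanup pass above gates the namespace/OPAL work on the controller's Optional Admin Command Support (OACS) field: nvme id-ctrl reports oacs as 0xe here, masking with 0x8 (the Namespace Management bit) gives oacs_ns_manage=8, so the harness proceeds, and it also checks that unvmcap (unallocated NVM capacity) is 0. A minimal standalone sketch of the same checks, assuming nvme-cli is installed and /dev/nvme0 is the controller under test:

    #!/usr/bin/env bash
    # Extract the OACS field from the controller identify data (same pipeline the harness uses)
    oacs=$(nvme id-ctrl /dev/nvme0 | grep oacs | cut -d: -f2)
    # Bit 3 (0x8) of OACS advertises Namespace Management/Attachment support
    if (( oacs & 0x8 )); then
        echo "namespace management supported"
    fi
    # Unallocated NVM capacity; 0 means all capacity is already allocated to namespaces
    unvmcap=$(nvme id-ctrl /dev/nvme0 | grep unvmcap | cut -d: -f2)
    echo "unvmcap: $unvmcap"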
00:05:44.042 [2024-11-19 17:49:36.859278] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid608361 ] 00:05:44.042 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.301 [2024-11-19 17:49:36.939886] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.301 [2024-11-19 17:49:36.980453] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:44.301 [2024-11-19 17:49:36.980570] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.868 17:49:37 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:44.868 17:49:37 -- common/autotest_common.sh@862 -- # return 0 00:05:44.868 17:49:37 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:44.868 17:49:37 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:44.868 17:49:37 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:48.156 nvme0n1 00:05:48.156 17:49:40 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:48.156 [2024-11-19 17:49:40.853797] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:48.156 request: 00:05:48.156 { 00:05:48.156 "nvme_ctrlr_name": "nvme0", 00:05:48.156 "password": "test", 00:05:48.156 "method": "bdev_nvme_opal_revert", 00:05:48.156 "req_id": 1 00:05:48.156 } 00:05:48.156 Got JSON-RPC error response 00:05:48.156 response: 00:05:48.156 { 00:05:48.156 "code": -32602, 00:05:48.156 "message": "Invalid parameters" 00:05:48.156 } 00:05:48.156 17:49:40 -- common/autotest_common.sh@1599 -- # true 00:05:48.156 17:49:40 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:48.156 17:49:40 -- common/autotest_common.sh@1603 -- # killprocess 608361 00:05:48.156 17:49:40 -- common/autotest_common.sh@936 -- # '[' -z 608361 ']' 00:05:48.156 17:49:40 -- common/autotest_common.sh@940 -- # kill -0 608361 00:05:48.156 17:49:40 -- common/autotest_common.sh@941 -- # uname 00:05:48.156 17:49:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:48.156 17:49:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 608361 00:05:48.156 17:49:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:48.156 17:49:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:48.156 17:49:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 608361' 00:05:48.156 killing process with pid 608361 00:05:48.156 17:49:40 -- common/autotest_common.sh@955 -- # kill 608361 00:05:48.156 17:49:40 -- common/autotest_common.sh@960 -- # wait 608361 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:48.156 EAL: Unexpected size 0 of DMA remapping cleared instead of 
2097152 [the 'EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152' warning repeats many more times as EAL unmaps the remaining 2 MB hugepages during process teardown]
00:05:50.321 17:49:43 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:50.321 17:49:43 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:50.321 17:49:43 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:50.321 17:49:43 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:50.321 17:49:43 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:50.321 17:49:43 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:50.321 17:49:43 -- common/autotest_common.sh@10 -- # set +x 00:05:50.321 17:49:43 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:50.321 17:49:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.321 17:49:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.321 17:49:43 -- common/autotest_common.sh@10 -- # set +x 00:05:50.321 ************************************ 00:05:50.321 START TEST env 00:05:50.321 ************************************ 00:05:50.321 17:49:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:50.321 * Looking for test storage... 00:05:50.321 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:50.321 17:49:43 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:50.321 17:49:43 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:50.321 17:49:43 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:50.582 17:49:43 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:50.582 17:49:43 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:50.582 17:49:43 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:50.582 17:49:43 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:50.582 17:49:43 -- scripts/common.sh@335 -- # IFS=.-: 00:05:50.582 17:49:43 -- scripts/common.sh@335 -- # read -ra ver1 00:05:50.582 17:49:43 -- scripts/common.sh@336 -- # IFS=.-: 00:05:50.582 17:49:43 -- scripts/common.sh@336 -- # read -ra ver2 00:05:50.582 17:49:43 -- scripts/common.sh@337 -- # local 'op=<' 00:05:50.582 17:49:43 -- scripts/common.sh@339 -- # ver1_l=2 00:05:50.582 17:49:43 -- scripts/common.sh@340 -- # ver2_l=1 00:05:50.582 17:49:43 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:50.582 17:49:43 -- scripts/common.sh@343 -- # case "$op" in 00:05:50.582 17:49:43 -- scripts/common.sh@344 -- # : 1 00:05:50.582 17:49:43 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:50.582 17:49:43 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:50.582 17:49:43 -- scripts/common.sh@364 -- # decimal 1 00:05:50.582 17:49:43 -- scripts/common.sh@352 -- # local d=1 00:05:50.582 17:49:43 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:50.582 17:49:43 -- scripts/common.sh@354 -- # echo 1 00:05:50.582 17:49:43 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:50.582 17:49:43 -- scripts/common.sh@365 -- # decimal 2 00:05:50.582 17:49:43 -- scripts/common.sh@352 -- # local d=2 00:05:50.582 17:49:43 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:50.582 17:49:43 -- scripts/common.sh@354 -- # echo 2 00:05:50.582 17:49:43 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:50.582 17:49:43 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:50.582 17:49:43 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:50.582 17:49:43 -- scripts/common.sh@367 -- # return 0 00:05:50.582 17:49:43 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:50.582 17:49:43 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:50.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.582 --rc genhtml_branch_coverage=1 00:05:50.582 --rc genhtml_function_coverage=1 00:05:50.582 --rc genhtml_legend=1 00:05:50.582 --rc geninfo_all_blocks=1 00:05:50.582 --rc geninfo_unexecuted_blocks=1 00:05:50.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.582 ' 00:05:50.582 17:49:43 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:50.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.582 --rc genhtml_branch_coverage=1 00:05:50.582 --rc genhtml_function_coverage=1 00:05:50.582 --rc genhtml_legend=1 00:05:50.582 --rc geninfo_all_blocks=1 00:05:50.582 --rc geninfo_unexecuted_blocks=1 00:05:50.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.582 ' 00:05:50.582 17:49:43 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:50.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.582 --rc genhtml_branch_coverage=1 00:05:50.582 --rc genhtml_function_coverage=1 00:05:50.582 --rc genhtml_legend=1 00:05:50.582 --rc geninfo_all_blocks=1 00:05:50.582 --rc geninfo_unexecuted_blocks=1 00:05:50.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.582 ' 00:05:50.582 17:49:43 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:50.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.582 --rc genhtml_branch_coverage=1 00:05:50.582 --rc genhtml_function_coverage=1 00:05:50.582 --rc genhtml_legend=1 00:05:50.582 --rc geninfo_all_blocks=1 00:05:50.582 --rc geninfo_unexecuted_blocks=1 00:05:50.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.582 ' 00:05:50.582 17:49:43 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:50.582 17:49:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.582 17:49:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.582 17:49:43 -- common/autotest_common.sh@10 -- # set +x 00:05:50.582 ************************************ 00:05:50.582 START TEST env_memory 00:05:50.582 ************************************ 00:05:50.582 17:49:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
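[Note] The lcov gate traced above (lt 1.15 2) runs scripts/common.sh's cmp_versions, which splits the two version strings on '.', '-' and ':' (IFS=.-:) and compares them component by component, treating missing components as 0. A rough standalone equivalent, simplified to numeric dotted versions (version_lt is a hypothetical name, not the harness's own helper):

    # Succeeds when $1 sorts before $2 as a dotted numeric version
    version_lt() {
        local IFS=.
        local -a a=($1) b=($2)
        local i
        for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1  # equal versions are not "less than"
    }
    version_lt 1.15 2 && echo "1.15 < 2"   # prints: 1.15 < 2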
00:05:50.582 00:05:50.582 00:05:50.582 CUnit - A unit testing framework for C - Version 2.1-3 00:05:50.582 http://cunit.sourceforge.net/ 00:05:50.582 00:05:50.582 00:05:50.582 Suite: memory 00:05:50.583 Test: alloc and free memory map ...[2024-11-19 17:49:43.275881] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:50.583 passed 00:05:50.583 Test: mem map translation ...[2024-11-19 17:49:43.289525] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:50.583 [2024-11-19 17:49:43.289541] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:50.583 [2024-11-19 17:49:43.289571] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:50.583 [2024-11-19 17:49:43.289580] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:50.583 passed 00:05:50.583 Test: mem map registration ...[2024-11-19 17:49:43.309715] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:50.583 [2024-11-19 17:49:43.309730] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:50.583 passed 00:05:50.583 Test: mem map adjacent registrations ...passed 00:05:50.583 00:05:50.583 Run Summary: Type Total Ran Passed Failed Inactive 00:05:50.583 suites 1 1 n/a 0 0 00:05:50.583 tests 4 4 4 0 0 00:05:50.583 asserts 152 152 152 0 n/a 00:05:50.583 00:05:50.583 Elapsed time = 0.086 seconds 00:05:50.583 00:05:50.583 real 0m0.099s 00:05:50.583 user 0m0.088s 00:05:50.583 sys 0m0.011s 00:05:50.583 17:49:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.583 17:49:43 -- common/autotest_common.sh@10 -- # set +x 00:05:50.583 ************************************ 00:05:50.583 END TEST env_memory 00:05:50.583 ************************************ 00:05:50.583 17:49:43 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:50.583 17:49:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.583 17:49:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.583 17:49:43 -- common/autotest_common.sh@10 -- # set +x 00:05:50.583 ************************************ 00:05:50.583 START TEST env_vtophys 00:05:50.583 ************************************ 00:05:50.583 17:49:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:50.583 EAL: lib.eal log level changed from notice to debug 00:05:50.583 EAL: Detected lcore 0 as core 0 on socket 0 00:05:50.583 EAL: Detected lcore 1 as core 1 on socket 0 00:05:50.583 EAL: Detected lcore 2 as core 2 on socket 0 00:05:50.583 EAL: Detected lcore 3 as core 3 on socket 0 00:05:50.583 EAL: Detected lcore 4 as core 4 on socket 0 00:05:50.583 EAL: Detected lcore 5 as core 5 on socket 0 00:05:50.583 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:50.583 EAL: Detected lcore 7 as core 8 on socket 0 00:05:50.583 EAL: Detected lcore 8 as core 9 on socket 0 00:05:50.583 EAL: Detected lcore 9 as core 10 on socket 0 00:05:50.583 EAL: Detected lcore 10 as core 11 on socket 0 00:05:50.583 EAL: Detected lcore 11 as core 12 on socket 0 00:05:50.583 EAL: Detected lcore 12 as core 13 on socket 0 00:05:50.583 EAL: Detected lcore 13 as core 14 on socket 0 00:05:50.583 EAL: Detected lcore 14 as core 16 on socket 0 00:05:50.583 EAL: Detected lcore 15 as core 17 on socket 0 00:05:50.583 EAL: Detected lcore 16 as core 18 on socket 0 00:05:50.583 EAL: Detected lcore 17 as core 19 on socket 0 00:05:50.583 EAL: Detected lcore 18 as core 20 on socket 0 00:05:50.583 EAL: Detected lcore 19 as core 21 on socket 0 00:05:50.583 EAL: Detected lcore 20 as core 22 on socket 0 00:05:50.583 EAL: Detected lcore 21 as core 24 on socket 0 00:05:50.583 EAL: Detected lcore 22 as core 25 on socket 0 00:05:50.583 EAL: Detected lcore 23 as core 26 on socket 0 00:05:50.583 EAL: Detected lcore 24 as core 27 on socket 0 00:05:50.583 EAL: Detected lcore 25 as core 28 on socket 0 00:05:50.583 EAL: Detected lcore 26 as core 29 on socket 0 00:05:50.583 EAL: Detected lcore 27 as core 30 on socket 0 00:05:50.583 EAL: Detected lcore 28 as core 0 on socket 1 00:05:50.583 EAL: Detected lcore 29 as core 1 on socket 1 00:05:50.583 EAL: Detected lcore 30 as core 2 on socket 1 00:05:50.583 EAL: Detected lcore 31 as core 3 on socket 1 00:05:50.583 EAL: Detected lcore 32 as core 4 on socket 1 00:05:50.583 EAL: Detected lcore 33 as core 5 on socket 1 00:05:50.583 EAL: Detected lcore 34 as core 6 on socket 1 00:05:50.583 EAL: Detected lcore 35 as core 8 on socket 1 00:05:50.583 EAL: Detected lcore 36 as core 9 on socket 1 00:05:50.583 EAL: Detected lcore 37 as core 10 on socket 1 00:05:50.583 EAL: Detected lcore 38 as core 11 on socket 1 00:05:50.583 EAL: Detected lcore 39 as core 12 on socket 1 00:05:50.583 EAL: Detected lcore 40 as core 13 on socket 1 00:05:50.583 EAL: Detected lcore 41 as core 14 on socket 1 00:05:50.583 EAL: Detected lcore 42 as core 16 on socket 1 00:05:50.583 EAL: Detected lcore 43 as core 17 on socket 1 00:05:50.583 EAL: Detected lcore 44 as core 18 on socket 1 00:05:50.583 EAL: Detected lcore 45 as core 19 on socket 1 00:05:50.583 EAL: Detected lcore 46 as core 20 on socket 1 00:05:50.583 EAL: Detected lcore 47 as core 21 on socket 1 00:05:50.583 EAL: Detected lcore 48 as core 22 on socket 1 00:05:50.583 EAL: Detected lcore 49 as core 24 on socket 1 00:05:50.583 EAL: Detected lcore 50 as core 25 on socket 1 00:05:50.583 EAL: Detected lcore 51 as core 26 on socket 1 00:05:50.583 EAL: Detected lcore 52 as core 27 on socket 1 00:05:50.583 EAL: Detected lcore 53 as core 28 on socket 1 00:05:50.583 EAL: Detected lcore 54 as core 29 on socket 1 00:05:50.583 EAL: Detected lcore 55 as core 30 on socket 1 00:05:50.583 EAL: Detected lcore 56 as core 0 on socket 0 00:05:50.583 EAL: Detected lcore 57 as core 1 on socket 0 00:05:50.583 EAL: Detected lcore 58 as core 2 on socket 0 00:05:50.583 EAL: Detected lcore 59 as core 3 on socket 0 00:05:50.583 EAL: Detected lcore 60 as core 4 on socket 0 00:05:50.583 EAL: Detected lcore 61 as core 5 on socket 0 00:05:50.583 EAL: Detected lcore 62 as core 6 on socket 0 00:05:50.583 EAL: Detected lcore 63 as core 8 on socket 0 00:05:50.583 EAL: Detected lcore 64 as core 9 on socket 0 00:05:50.583 EAL: Detected lcore 65 as core 10 on socket 0 00:05:50.583 EAL: Detected lcore 66 as core 11 on socket 0 00:05:50.583 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:50.583 EAL: Detected lcore 68 as core 13 on socket 0 00:05:50.583 EAL: Detected lcore 69 as core 14 on socket 0 00:05:50.583 EAL: Detected lcore 70 as core 16 on socket 0 00:05:50.583 EAL: Detected lcore 71 as core 17 on socket 0 00:05:50.583 EAL: Detected lcore 72 as core 18 on socket 0 00:05:50.583 EAL: Detected lcore 73 as core 19 on socket 0 00:05:50.583 EAL: Detected lcore 74 as core 20 on socket 0 00:05:50.583 EAL: Detected lcore 75 as core 21 on socket 0 00:05:50.583 EAL: Detected lcore 76 as core 22 on socket 0 00:05:50.583 EAL: Detected lcore 77 as core 24 on socket 0 00:05:50.583 EAL: Detected lcore 78 as core 25 on socket 0 00:05:50.583 EAL: Detected lcore 79 as core 26 on socket 0 00:05:50.583 EAL: Detected lcore 80 as core 27 on socket 0 00:05:50.583 EAL: Detected lcore 81 as core 28 on socket 0 00:05:50.583 EAL: Detected lcore 82 as core 29 on socket 0 00:05:50.583 EAL: Detected lcore 83 as core 30 on socket 0 00:05:50.583 EAL: Detected lcore 84 as core 0 on socket 1 00:05:50.583 EAL: Detected lcore 85 as core 1 on socket 1 00:05:50.583 EAL: Detected lcore 86 as core 2 on socket 1 00:05:50.583 EAL: Detected lcore 87 as core 3 on socket 1 00:05:50.583 EAL: Detected lcore 88 as core 4 on socket 1 00:05:50.583 EAL: Detected lcore 89 as core 5 on socket 1 00:05:50.583 EAL: Detected lcore 90 as core 6 on socket 1 00:05:50.583 EAL: Detected lcore 91 as core 8 on socket 1 00:05:50.583 EAL: Detected lcore 92 as core 9 on socket 1 00:05:50.583 EAL: Detected lcore 93 as core 10 on socket 1 00:05:50.583 EAL: Detected lcore 94 as core 11 on socket 1 00:05:50.583 EAL: Detected lcore 95 as core 12 on socket 1 00:05:50.583 EAL: Detected lcore 96 as core 13 on socket 1 00:05:50.583 EAL: Detected lcore 97 as core 14 on socket 1 00:05:50.583 EAL: Detected lcore 98 as core 16 on socket 1 00:05:50.583 EAL: Detected lcore 99 as core 17 on socket 1 00:05:50.583 EAL: Detected lcore 100 as core 18 on socket 1 00:05:50.583 EAL: Detected lcore 101 as core 19 on socket 1 00:05:50.583 EAL: Detected lcore 102 as core 20 on socket 1 00:05:50.583 EAL: Detected lcore 103 as core 21 on socket 1 00:05:50.583 EAL: Detected lcore 104 as core 22 on socket 1 00:05:50.583 EAL: Detected lcore 105 as core 24 on socket 1 00:05:50.583 EAL: Detected lcore 106 as core 25 on socket 1 00:05:50.583 EAL: Detected lcore 107 as core 26 on socket 1 00:05:50.583 EAL: Detected lcore 108 as core 27 on socket 1 00:05:50.583 EAL: Detected lcore 109 as core 28 on socket 1 00:05:50.583 EAL: Detected lcore 110 as core 29 on socket 1 00:05:50.583 EAL: Detected lcore 111 as core 30 on socket 1 00:05:50.583 EAL: Maximum logical cores by configuration: 128 00:05:50.583 EAL: Detected CPU lcores: 112 00:05:50.583 EAL: Detected NUMA nodes: 2 00:05:50.583 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:05:50.583 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:50.583 EAL: Checking presence of .so 'librte_eal.so' 00:05:50.583 EAL: Detected static linkage of DPDK 00:05:50.583 EAL: No shared files mode enabled, IPC will be disabled 00:05:50.583 EAL: Bus pci wants IOVA as 'DC' 00:05:50.583 EAL: Buses did not request a specific IOVA mode. 00:05:50.583 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:50.583 EAL: Selected IOVA mode 'VA' 00:05:50.583 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.583 EAL: Probing VFIO support... 
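[Note] vtophys only reaches this point because the host IOMMU is active: EAL has selected IOVA mode 'VA' and probes VFIO next (the probe results follow below). A quick way to confirm the same preconditions on a host before running DPDK/SPDK, using standard sysfs paths (group numbers will differ per machine; 0000:d8:00.0 is this host's NVMe BDF):

    # Non-empty output means the kernel built IOMMU groups, so vfio-pci can be used
    ls /sys/kernel/iommu_groups | head
    # Show which driver the NVMe BDF is currently bound to (vfio-pci vs nvme)
    basename "$(readlink /sys/bus/pci/devices/0000:d8:00.0/driver)"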
00:05:50.583 EAL: IOMMU type 1 (Type 1) is supported 00:05:50.583 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:50.583 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:50.583 EAL: VFIO support initialized 00:05:50.583 EAL: Ask a virtual area of 0x2e000 bytes 00:05:50.583 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:50.583 EAL: Setting up physically contiguous memory... 00:05:50.583 EAL: Setting maximum number of open files to 524288 00:05:50.583 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:50.583 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:50.583 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:50.583 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.583 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:50.583 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:50.583 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:50.584 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:50.584 EAL: Ask a virtual area of 0x61000 bytes 00:05:50.584 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:50.584 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:50.584 EAL: Ask a virtual area of 0x400000000 bytes 00:05:50.584 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:50.584 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:50.584 EAL: Hugepages will be freed exactly as allocated. 00:05:50.584 EAL: No shared files mode enabled, IPC is disabled 00:05:50.584 EAL: No shared files mode enabled, IPC is disabled 00:05:50.584 EAL: TSC frequency is ~2500000 KHz 00:05:50.584 EAL: Main lcore 0 is ready (tid=7f2f8b658a00;cpuset=[0]) 00:05:50.584 EAL: Trying to obtain current memory policy. 00:05:50.584 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.584 EAL: Restoring previous memory policy: 0 00:05:50.584 EAL: request: mp_malloc_sync 00:05:50.584 EAL: No shared files mode enabled, IPC is disabled 00:05:50.584 EAL: Heap on socket 0 was expanded by 2MB 00:05:50.584 EAL: No shared files mode enabled, IPC is disabled 00:05:50.844 EAL: Mem event callback 'spdk:(nil)' registered 00:05:50.844 00:05:50.844 00:05:50.844 CUnit - A unit testing framework for C - Version 2.1-3 00:05:50.844 http://cunit.sourceforge.net/ 00:05:50.844 00:05:50.844 00:05:50.844 Suite: components_suite 00:05:50.844 Test: vtophys_malloc_test ...passed 00:05:50.844 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:50.844 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.844 EAL: Restoring previous memory policy: 4 00:05:50.844 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.844 EAL: request: mp_malloc_sync 00:05:50.844 EAL: No shared files mode enabled, IPC is disabled 00:05:50.844 EAL: Heap on socket 0 was expanded by 4MB 00:05:50.844 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.844 EAL: request: mp_malloc_sync 00:05:50.844 EAL: No shared files mode enabled, IPC is disabled 00:05:50.844 EAL: Heap on socket 0 was shrunk by 4MB 00:05:50.844 EAL: Trying to obtain current memory policy. 00:05:50.844 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.844 EAL: Restoring previous memory policy: 4 00:05:50.844 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.844 EAL: request: mp_malloc_sync 00:05:50.844 EAL: No shared files mode enabled, IPC is disabled 00:05:50.844 EAL: Heap on socket 0 was expanded by 6MB 00:05:50.844 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.844 EAL: request: mp_malloc_sync 00:05:50.844 EAL: No shared files mode enabled, IPC is disabled 00:05:50.844 EAL: Heap on socket 0 was shrunk by 6MB 00:05:50.844 EAL: Trying to obtain current memory policy. 00:05:50.844 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.844 EAL: Restoring previous memory policy: 4 00:05:50.844 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.844 EAL: request: mp_malloc_sync 00:05:50.844 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was expanded by 10MB 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was shrunk by 10MB 00:05:50.845 EAL: Trying to obtain current memory policy. 
00:05:50.845 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.845 EAL: Restoring previous memory policy: 4 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was expanded by 18MB 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was shrunk by 18MB 00:05:50.845 EAL: Trying to obtain current memory policy. 00:05:50.845 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.845 EAL: Restoring previous memory policy: 4 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was expanded by 34MB 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was shrunk by 34MB 00:05:50.845 EAL: Trying to obtain current memory policy. 00:05:50.845 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.845 EAL: Restoring previous memory policy: 4 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was expanded by 66MB 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was shrunk by 66MB 00:05:50.845 EAL: Trying to obtain current memory policy. 00:05:50.845 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.845 EAL: Restoring previous memory policy: 4 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was expanded by 130MB 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was shrunk by 130MB 00:05:50.845 EAL: Trying to obtain current memory policy. 00:05:50.845 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:50.845 EAL: Restoring previous memory policy: 4 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:50.845 EAL: request: mp_malloc_sync 00:05:50.845 EAL: No shared files mode enabled, IPC is disabled 00:05:50.845 EAL: Heap on socket 0 was expanded by 258MB 00:05:50.845 EAL: Calling mem event callback 'spdk:(nil)' 00:05:51.104 EAL: request: mp_malloc_sync 00:05:51.104 EAL: No shared files mode enabled, IPC is disabled 00:05:51.104 EAL: Heap on socket 0 was shrunk by 258MB 00:05:51.104 EAL: Trying to obtain current memory policy. 
00:05:51.104 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:51.104 EAL: Restoring previous memory policy: 4 00:05:51.104 EAL: Calling mem event callback 'spdk:(nil)' 00:05:51.104 EAL: request: mp_malloc_sync 00:05:51.104 EAL: No shared files mode enabled, IPC is disabled 00:05:51.104 EAL: Heap on socket 0 was expanded by 514MB 00:05:51.104 EAL: Calling mem event callback 'spdk:(nil)' 00:05:51.364 EAL: request: mp_malloc_sync 00:05:51.364 EAL: No shared files mode enabled, IPC is disabled 00:05:51.364 EAL: Heap on socket 0 was shrunk by 514MB 00:05:51.364 EAL: Trying to obtain current memory policy. 00:05:51.364 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:51.364 EAL: Restoring previous memory policy: 4 00:05:51.364 EAL: Calling mem event callback 'spdk:(nil)' 00:05:51.364 EAL: request: mp_malloc_sync 00:05:51.364 EAL: No shared files mode enabled, IPC is disabled 00:05:51.364 EAL: Heap on socket 0 was expanded by 1026MB 00:05:51.623 EAL: Calling mem event callback 'spdk:(nil)' 00:05:51.623 EAL: request: mp_malloc_sync 00:05:51.623 EAL: No shared files mode enabled, IPC is disabled 00:05:51.623 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:51.623 passed 00:05:51.623 00:05:51.623 Run Summary: Type Total Ran Passed Failed Inactive 00:05:51.623 suites 1 1 n/a 0 0 00:05:51.623 tests 2 2 2 0 0 00:05:51.623 asserts 497 497 497 0 n/a 00:05:51.623 00:05:51.623 Elapsed time = 0.974 seconds 00:05:51.623 EAL: Calling mem event callback 'spdk:(nil)' 00:05:51.623 EAL: request: mp_malloc_sync 00:05:51.623 EAL: No shared files mode enabled, IPC is disabled 00:05:51.623 EAL: Heap on socket 0 was shrunk by 2MB 00:05:51.623 EAL: No shared files mode enabled, IPC is disabled 00:05:51.623 EAL: No shared files mode enabled, IPC is disabled 00:05:51.883 EAL: No shared files mode enabled, IPC is disabled 00:05:51.883 00:05:51.883 real 0m1.102s 00:05:51.883 user 0m0.643s 00:05:51.883 sys 0m0.434s 00:05:51.883 17:49:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.883 17:49:44 -- common/autotest_common.sh@10 -- # set +x 00:05:51.883 ************************************ 00:05:51.883 END TEST env_vtophys 00:05:51.883 ************************************ 00:05:51.883 17:49:44 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:51.883 17:49:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.883 17:49:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.883 17:49:44 -- common/autotest_common.sh@10 -- # set +x 00:05:51.883 ************************************ 00:05:51.883 START TEST env_pci 00:05:51.883 ************************************ 00:05:51.883 17:49:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:51.883 00:05:51.883 00:05:51.883 CUnit - A unit testing framework for C - Version 2.1-3 00:05:51.883 http://cunit.sourceforge.net/ 00:05:51.883 00:05:51.883 00:05:51.883 Suite: pci 00:05:51.883 Test: pci_hook ...[2024-11-19 17:49:44.553977] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 609804 has claimed it 00:05:51.883 EAL: Cannot find device (10000:00:01.0) 00:05:51.883 EAL: Failed to attach device on primary process 00:05:51.883 passed 00:05:51.883 00:05:51.883 Run Summary: Type Total Ran Passed Failed Inactive 00:05:51.883 suites 1 1 n/a 0 0 00:05:51.883 tests 1 1 1 0 0 
00:05:51.883 asserts 25 25 25 0 n/a 00:05:51.883 00:05:51.883 Elapsed time = 0.034 seconds 00:05:51.883 00:05:51.883 real 0m0.053s 00:05:51.883 user 0m0.010s 00:05:51.883 sys 0m0.043s 00:05:51.883 17:49:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.883 17:49:44 -- common/autotest_common.sh@10 -- # set +x 00:05:51.883 ************************************ 00:05:51.883 END TEST env_pci 00:05:51.883 ************************************ 00:05:51.883 17:49:44 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:51.883 17:49:44 -- env/env.sh@15 -- # uname 00:05:51.883 17:49:44 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:51.883 17:49:44 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:51.883 17:49:44 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:51.883 17:49:44 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:51.883 17:49:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.883 17:49:44 -- common/autotest_common.sh@10 -- # set +x 00:05:51.883 ************************************ 00:05:51.883 START TEST env_dpdk_post_init 00:05:51.883 ************************************ 00:05:51.883 17:49:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:51.883 EAL: Detected CPU lcores: 112 00:05:51.883 EAL: Detected NUMA nodes: 2 00:05:51.883 EAL: Detected static linkage of DPDK 00:05:51.883 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:51.883 EAL: Selected IOVA mode 'VA' 00:05:51.883 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.883 EAL: VFIO support initialized 00:05:51.883 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:52.142 EAL: Using IOMMU type 1 (Type 1) 00:05:52.711 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:56.907 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:56.907 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:56.907 Starting DPDK initialization... 00:05:56.907 Starting SPDK post initialization... 00:05:56.907 SPDK NVMe probe 00:05:56.907 Attaching to 0000:d8:00.0 00:05:56.907 Attached to 0000:d8:00.0 00:05:56.907 Cleaning up... 
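[Note] The env_dpdk_post_init run above exercises the same attach path the earlier opal step used: spdk_nvme claims 0000:d8:00.0 through vfio-pci, attaches, then releases the mapped PCI resource on cleanup. Against a long-running spdk_tgt, an equivalent cycle can be driven manually with the RPC client already shown in this log (BDF taken from this host; adjust to yours):

    # Attach the PCIe controller as bdev controller "nvme0", then list the resulting bdevs
    ./scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    ./scripts/rpc.py bdev_get_bdevs
    # Detach when finished so the device can be rebound to the kernel nvme driver
    ./scripts/rpc.py bdev_nvme_detach_controller nvme0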
00:05:56.907 00:05:56.907 real 0m4.686s 00:05:56.907 user 0m3.507s 00:05:56.907 sys 0m0.421s 00:05:56.907 17:49:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.907 17:49:49 -- common/autotest_common.sh@10 -- # set +x 00:05:56.907 ************************************ 00:05:56.907 END TEST env_dpdk_post_init 00:05:56.907 ************************************ 00:05:56.907 17:49:49 -- env/env.sh@26 -- # uname 00:05:56.907 17:49:49 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:56.907 17:49:49 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:56.907 17:49:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:56.907 17:49:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.907 17:49:49 -- common/autotest_common.sh@10 -- # set +x 00:05:56.907 ************************************ 00:05:56.907 START TEST env_mem_callbacks 00:05:56.907 ************************************ 00:05:56.907 17:49:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:56.907 EAL: Detected CPU lcores: 112 00:05:56.907 EAL: Detected NUMA nodes: 2 00:05:56.907 EAL: Detected static linkage of DPDK 00:05:56.907 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:56.907 EAL: Selected IOVA mode 'VA' 00:05:56.907 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.907 EAL: VFIO support initialized 00:05:56.907 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:56.907 00:05:56.907 00:05:56.907 CUnit - A unit testing framework for C - Version 2.1-3 00:05:56.907 http://cunit.sourceforge.net/ 00:05:56.907 00:05:56.907 00:05:56.907 Suite: memory 00:05:56.907 Test: test ... 
00:05:56.907 register 0x200000200000 2097152 00:05:56.907 malloc 3145728 00:05:56.907 register 0x200000400000 4194304 00:05:56.907 buf 0x200000500000 len 3145728 PASSED 00:05:56.907 malloc 64 00:05:56.907 buf 0x2000004fff40 len 64 PASSED 00:05:56.907 malloc 4194304 00:05:56.907 register 0x200000800000 6291456 00:05:56.907 buf 0x200000a00000 len 4194304 PASSED 00:05:56.907 free 0x200000500000 3145728 00:05:56.907 free 0x2000004fff40 64 00:05:56.907 unregister 0x200000400000 4194304 PASSED 00:05:56.907 free 0x200000a00000 4194304 00:05:56.907 unregister 0x200000800000 6291456 PASSED 00:05:56.907 malloc 8388608 00:05:56.907 register 0x200000400000 10485760 00:05:56.907 buf 0x200000600000 len 8388608 PASSED 00:05:56.907 free 0x200000600000 8388608 00:05:56.907 unregister 0x200000400000 10485760 PASSED 00:05:56.907 passed 00:05:56.907 00:05:56.907 Run Summary: Type Total Ran Passed Failed Inactive 00:05:56.907 suites 1 1 n/a 0 0 00:05:56.907 tests 1 1 1 0 0 00:05:56.907 asserts 15 15 15 0 n/a 00:05:56.907 00:05:56.907 Elapsed time = 0.008 seconds 00:05:56.907 00:05:56.907 real 0m0.065s 00:05:56.907 user 0m0.012s 00:05:56.907 sys 0m0.053s 00:05:56.907 17:49:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.907 17:49:49 -- common/autotest_common.sh@10 -- # set +x 00:05:56.907 ************************************ 00:05:56.907 END TEST env_mem_callbacks 00:05:56.907 ************************************ 00:05:56.907 00:05:56.907 real 0m6.457s 00:05:56.907 user 0m4.468s 00:05:56.907 sys 0m1.264s 00:05:56.907 17:49:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.907 17:49:49 -- common/autotest_common.sh@10 -- # set +x 00:05:56.907 ************************************ 00:05:56.907 END TEST env 00:05:56.907 ************************************ 00:05:56.907 17:49:49 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:56.907 17:49:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:56.907 17:49:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.907 17:49:49 -- common/autotest_common.sh@10 -- # set +x 00:05:56.907 ************************************ 00:05:56.907 START TEST rpc 00:05:56.907 ************************************ 00:05:56.907 17:49:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:56.907 * Looking for test storage... 
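Every test in this log is bracketed by the same asterisk banners and a real/user/sys timing block; they come from the run_test wrapper whose argument checks (autotest_common.sh@1087/@1093) are traced above. A condensed sketch of its observable behavior, with the xtrace management and error handling of the real helper elided (this is a reconstruction from the log, not the autotest_common.sh source):

```bash
# Condensed run_test: banner, time the command, banner again.
run_test() {
  local test_name=$1; shift
  echo "************************************"
  echo "START TEST $test_name"
  echo "************************************"
  time "$@"                                  # produces the real/user/sys lines seen here
  echo "************************************"
  echo "END TEST $test_name"
  echo "************************************"
}
run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
```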
00:05:56.907 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:56.907 17:49:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:56.907 17:49:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:56.907 17:49:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:56.907 17:49:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:56.907 17:49:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:56.907 17:49:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:56.907 17:49:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:56.907 17:49:49 -- scripts/common.sh@335 -- # IFS=.-: 00:05:56.907 17:49:49 -- scripts/common.sh@335 -- # read -ra ver1 00:05:56.907 17:49:49 -- scripts/common.sh@336 -- # IFS=.-: 00:05:56.907 17:49:49 -- scripts/common.sh@336 -- # read -ra ver2 00:05:56.907 17:49:49 -- scripts/common.sh@337 -- # local 'op=<' 00:05:56.907 17:49:49 -- scripts/common.sh@339 -- # ver1_l=2 00:05:56.907 17:49:49 -- scripts/common.sh@340 -- # ver2_l=1 00:05:56.907 17:49:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:56.907 17:49:49 -- scripts/common.sh@343 -- # case "$op" in 00:05:56.907 17:49:49 -- scripts/common.sh@344 -- # : 1 00:05:56.907 17:49:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:56.907 17:49:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:56.907 17:49:49 -- scripts/common.sh@364 -- # decimal 1 00:05:56.907 17:49:49 -- scripts/common.sh@352 -- # local d=1 00:05:56.907 17:49:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:56.907 17:49:49 -- scripts/common.sh@354 -- # echo 1 00:05:56.907 17:49:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:56.907 17:49:49 -- scripts/common.sh@365 -- # decimal 2 00:05:56.907 17:49:49 -- scripts/common.sh@352 -- # local d=2 00:05:56.907 17:49:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:56.907 17:49:49 -- scripts/common.sh@354 -- # echo 2 00:05:56.908 17:49:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:56.908 17:49:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:56.908 17:49:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:56.908 17:49:49 -- scripts/common.sh@367 -- # return 0 00:05:56.908 17:49:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:56.908 17:49:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:56.908 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.908 --rc genhtml_branch_coverage=1 00:05:56.908 --rc genhtml_function_coverage=1 00:05:56.908 --rc genhtml_legend=1 00:05:56.908 --rc geninfo_all_blocks=1 00:05:56.908 --rc geninfo_unexecuted_blocks=1 00:05:56.908 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.908 ' 00:05:56.908 17:49:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:56.908 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.908 --rc genhtml_branch_coverage=1 00:05:56.908 --rc genhtml_function_coverage=1 00:05:56.908 --rc genhtml_legend=1 00:05:56.908 --rc geninfo_all_blocks=1 00:05:56.908 --rc geninfo_unexecuted_blocks=1 00:05:56.908 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.908 ' 00:05:56.908 17:49:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:56.908 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.908 --rc genhtml_branch_coverage=1 00:05:56.908 
--rc genhtml_function_coverage=1 00:05:56.908 --rc genhtml_legend=1 00:05:56.908 --rc geninfo_all_blocks=1 00:05:56.908 --rc geninfo_unexecuted_blocks=1 00:05:56.908 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.908 ' 00:05:56.908 17:49:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:56.908 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.908 --rc genhtml_branch_coverage=1 00:05:56.908 --rc genhtml_function_coverage=1 00:05:56.908 --rc genhtml_legend=1 00:05:56.908 --rc geninfo_all_blocks=1 00:05:56.908 --rc geninfo_unexecuted_blocks=1 00:05:56.908 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.908 ' 00:05:56.908 17:49:49 -- rpc/rpc.sh@65 -- # spdk_pid=610843 00:05:56.908 17:49:49 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:56.908 17:49:49 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:56.908 17:49:49 -- rpc/rpc.sh@67 -- # waitforlisten 610843 00:05:56.908 17:49:49 -- common/autotest_common.sh@829 -- # '[' -z 610843 ']' 00:05:56.908 17:49:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:56.908 17:49:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:56.908 17:49:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:56.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:56.908 17:49:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:56.908 17:49:49 -- common/autotest_common.sh@10 -- # set +x 00:05:56.908 [2024-11-19 17:49:49.757100] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:56.908 [2024-11-19 17:49:49.757174] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid610843 ] 00:05:57.168 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.168 [2024-11-19 17:49:49.839453] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.168 [2024-11-19 17:49:49.874841] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:57.168 [2024-11-19 17:49:49.874965] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:57.168 [2024-11-19 17:49:49.874976] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 610843' to capture a snapshot of events at runtime. 00:05:57.168 [2024-11-19 17:49:49.874985] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid610843 for offline analysis/debug. 
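Because rpc.sh started spdk_tgt with -e bdev, the app_setup_trace NOTICEs above offer two ways to inspect the enabled bdev tracepoint group; spelled out as commands (PID 610843 is specific to this run, so substitute your own spdk_tgt PID):

```bash
# The two options from the NOTICEs above.
spdk_trace -s spdk_tgt -p 610843               # live snapshot of runtime events
cp /dev/shm/spdk_tgt_trace.pid610843 /tmp/     # or keep the shm file for offline analysis
```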
00:05:57.168 [2024-11-19 17:49:49.875008] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.737 17:49:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:57.737 17:49:50 -- common/autotest_common.sh@862 -- # return 0 00:05:57.737 17:49:50 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:57.737 17:49:50 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:57.737 17:49:50 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:57.737 17:49:50 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:57.737 17:49:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:57.737 17:49:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.737 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.737 ************************************ 00:05:57.737 START TEST rpc_integrity 00:05:57.737 ************************************ 00:05:57.737 17:49:50 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:57.997 17:49:50 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:57.997 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.997 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.997 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.997 17:49:50 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:57.997 17:49:50 -- rpc/rpc.sh@13 -- # jq length 00:05:57.997 17:49:50 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:57.997 17:49:50 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:57.997 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.997 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.997 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.997 17:49:50 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:57.997 17:49:50 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:57.997 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.997 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.997 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.997 17:49:50 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:57.997 { 00:05:57.997 "name": "Malloc0", 00:05:57.997 "aliases": [ 00:05:57.997 "49cd027b-1957-45aa-92d6-75bed229c0ea" 00:05:57.997 ], 00:05:57.997 "product_name": "Malloc disk", 00:05:57.997 "block_size": 512, 00:05:57.997 "num_blocks": 16384, 00:05:57.997 "uuid": "49cd027b-1957-45aa-92d6-75bed229c0ea", 00:05:57.997 "assigned_rate_limits": { 00:05:57.997 "rw_ios_per_sec": 0, 00:05:57.997 "rw_mbytes_per_sec": 0, 00:05:57.997 "r_mbytes_per_sec": 0, 00:05:57.997 "w_mbytes_per_sec": 0 00:05:57.997 }, 00:05:57.997 "claimed": false, 00:05:57.997 "zoned": false, 00:05:57.997 "supported_io_types": { 00:05:57.997 "read": true, 00:05:57.997 "write": true, 00:05:57.997 "unmap": true, 00:05:57.997 "write_zeroes": true, 00:05:57.997 "flush": true, 00:05:57.997 "reset": true, 00:05:57.997 "compare": false, 00:05:57.997 "compare_and_write": false, 
00:05:57.997 "abort": true, 00:05:57.997 "nvme_admin": false, 00:05:57.997 "nvme_io": false 00:05:57.997 }, 00:05:57.997 "memory_domains": [ 00:05:57.997 { 00:05:57.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:57.997 "dma_device_type": 2 00:05:57.997 } 00:05:57.997 ], 00:05:57.997 "driver_specific": {} 00:05:57.997 } 00:05:57.997 ]' 00:05:57.997 17:49:50 -- rpc/rpc.sh@17 -- # jq length 00:05:57.997 17:49:50 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:57.997 17:49:50 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:57.997 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.997 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.997 [2024-11-19 17:49:50.738992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:57.997 [2024-11-19 17:49:50.739030] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:57.997 [2024-11-19 17:49:50.739057] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4c1f850 00:05:57.997 [2024-11-19 17:49:50.739067] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:57.997 [2024-11-19 17:49:50.739899] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:57.997 [2024-11-19 17:49:50.739923] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:57.997 Passthru0 00:05:57.997 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.997 17:49:50 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:57.997 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.997 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.997 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.997 17:49:50 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:57.997 { 00:05:57.997 "name": "Malloc0", 00:05:57.997 "aliases": [ 00:05:57.997 "49cd027b-1957-45aa-92d6-75bed229c0ea" 00:05:57.997 ], 00:05:57.997 "product_name": "Malloc disk", 00:05:57.997 "block_size": 512, 00:05:57.997 "num_blocks": 16384, 00:05:57.997 "uuid": "49cd027b-1957-45aa-92d6-75bed229c0ea", 00:05:57.997 "assigned_rate_limits": { 00:05:57.997 "rw_ios_per_sec": 0, 00:05:57.997 "rw_mbytes_per_sec": 0, 00:05:57.997 "r_mbytes_per_sec": 0, 00:05:57.997 "w_mbytes_per_sec": 0 00:05:57.997 }, 00:05:57.997 "claimed": true, 00:05:57.997 "claim_type": "exclusive_write", 00:05:57.997 "zoned": false, 00:05:57.997 "supported_io_types": { 00:05:57.997 "read": true, 00:05:57.997 "write": true, 00:05:57.997 "unmap": true, 00:05:57.997 "write_zeroes": true, 00:05:57.997 "flush": true, 00:05:57.997 "reset": true, 00:05:57.997 "compare": false, 00:05:57.997 "compare_and_write": false, 00:05:57.997 "abort": true, 00:05:57.997 "nvme_admin": false, 00:05:57.997 "nvme_io": false 00:05:57.997 }, 00:05:57.997 "memory_domains": [ 00:05:57.997 { 00:05:57.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:57.997 "dma_device_type": 2 00:05:57.997 } 00:05:57.997 ], 00:05:57.997 "driver_specific": {} 00:05:57.997 }, 00:05:57.997 { 00:05:57.997 "name": "Passthru0", 00:05:57.997 "aliases": [ 00:05:57.997 "b6197e69-dac6-5813-90b8-850ed8ee5f24" 00:05:57.997 ], 00:05:57.997 "product_name": "passthru", 00:05:57.997 "block_size": 512, 00:05:57.997 "num_blocks": 16384, 00:05:57.998 "uuid": "b6197e69-dac6-5813-90b8-850ed8ee5f24", 00:05:57.998 "assigned_rate_limits": { 00:05:57.998 "rw_ios_per_sec": 0, 00:05:57.998 "rw_mbytes_per_sec": 0, 00:05:57.998 "r_mbytes_per_sec": 0, 00:05:57.998 
"w_mbytes_per_sec": 0 00:05:57.998 }, 00:05:57.998 "claimed": false, 00:05:57.998 "zoned": false, 00:05:57.998 "supported_io_types": { 00:05:57.998 "read": true, 00:05:57.998 "write": true, 00:05:57.998 "unmap": true, 00:05:57.998 "write_zeroes": true, 00:05:57.998 "flush": true, 00:05:57.998 "reset": true, 00:05:57.998 "compare": false, 00:05:57.998 "compare_and_write": false, 00:05:57.998 "abort": true, 00:05:57.998 "nvme_admin": false, 00:05:57.998 "nvme_io": false 00:05:57.998 }, 00:05:57.998 "memory_domains": [ 00:05:57.998 { 00:05:57.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:57.998 "dma_device_type": 2 00:05:57.998 } 00:05:57.998 ], 00:05:57.998 "driver_specific": { 00:05:57.998 "passthru": { 00:05:57.998 "name": "Passthru0", 00:05:57.998 "base_bdev_name": "Malloc0" 00:05:57.998 } 00:05:57.998 } 00:05:57.998 } 00:05:57.998 ]' 00:05:57.998 17:49:50 -- rpc/rpc.sh@21 -- # jq length 00:05:57.998 17:49:50 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:57.998 17:49:50 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:57.998 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.998 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.998 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.998 17:49:50 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:57.998 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.998 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.998 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.998 17:49:50 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:57.998 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.998 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:57.998 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.998 17:49:50 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:57.998 17:49:50 -- rpc/rpc.sh@26 -- # jq length 00:05:58.258 17:49:50 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:58.258 00:05:58.258 real 0m0.288s 00:05:58.258 user 0m0.177s 00:05:58.258 sys 0m0.050s 00:05:58.258 17:49:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:58.258 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:58.258 ************************************ 00:05:58.258 END TEST rpc_integrity 00:05:58.258 ************************************ 00:05:58.258 17:49:50 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:58.258 17:49:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:58.258 17:49:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:58.258 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:58.258 ************************************ 00:05:58.258 START TEST rpc_plugins 00:05:58.258 ************************************ 00:05:58.258 17:49:50 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:05:58.258 17:49:50 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:58.258 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.258 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:58.258 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.258 17:49:50 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:58.258 17:49:50 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:58.258 17:49:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.258 17:49:50 -- common/autotest_common.sh@10 -- # set +x 00:05:58.258 17:49:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.258 17:49:50 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:05:58.258 { 00:05:58.258 "name": "Malloc1", 00:05:58.258 "aliases": [ 00:05:58.258 "ceedb4da-62c5-46d7-95db-639bf9d4c997" 00:05:58.258 ], 00:05:58.258 "product_name": "Malloc disk", 00:05:58.258 "block_size": 4096, 00:05:58.258 "num_blocks": 256, 00:05:58.258 "uuid": "ceedb4da-62c5-46d7-95db-639bf9d4c997", 00:05:58.258 "assigned_rate_limits": { 00:05:58.258 "rw_ios_per_sec": 0, 00:05:58.258 "rw_mbytes_per_sec": 0, 00:05:58.258 "r_mbytes_per_sec": 0, 00:05:58.258 "w_mbytes_per_sec": 0 00:05:58.258 }, 00:05:58.258 "claimed": false, 00:05:58.258 "zoned": false, 00:05:58.258 "supported_io_types": { 00:05:58.258 "read": true, 00:05:58.258 "write": true, 00:05:58.258 "unmap": true, 00:05:58.258 "write_zeroes": true, 00:05:58.258 "flush": true, 00:05:58.258 "reset": true, 00:05:58.258 "compare": false, 00:05:58.258 "compare_and_write": false, 00:05:58.258 "abort": true, 00:05:58.258 "nvme_admin": false, 00:05:58.258 "nvme_io": false 00:05:58.258 }, 00:05:58.258 "memory_domains": [ 00:05:58.258 { 00:05:58.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:58.258 "dma_device_type": 2 00:05:58.258 } 00:05:58.258 ], 00:05:58.258 "driver_specific": {} 00:05:58.259 } 00:05:58.259 ]' 00:05:58.259 17:49:50 -- rpc/rpc.sh@32 -- # jq length 00:05:58.259 17:49:51 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:58.259 17:49:51 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:58.259 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.259 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.259 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.259 17:49:51 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:58.259 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.259 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.259 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.259 17:49:51 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:58.259 17:49:51 -- rpc/rpc.sh@36 -- # jq length 00:05:58.259 17:49:51 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:58.259 00:05:58.259 real 0m0.143s 00:05:58.259 user 0m0.087s 00:05:58.259 sys 0m0.021s 00:05:58.259 17:49:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:58.259 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.259 ************************************ 00:05:58.259 END TEST rpc_plugins 00:05:58.259 ************************************ 00:05:58.259 17:49:51 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:58.259 17:49:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:58.259 17:49:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:58.259 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.519 ************************************ 00:05:58.519 START TEST rpc_trace_cmd_test 00:05:58.519 ************************************ 00:05:58.519 17:49:51 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:58.519 17:49:51 -- rpc/rpc.sh@40 -- # local info 00:05:58.519 17:49:51 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:58.519 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.519 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.519 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.519 17:49:51 -- rpc/rpc.sh@42 -- # info='{ 00:05:58.519 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid610843", 00:05:58.519 "tpoint_group_mask": "0x8", 00:05:58.519 "iscsi_conn": { 00:05:58.519 "mask": "0x2", 
00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "scsi": { 00:05:58.519 "mask": "0x4", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "bdev": { 00:05:58.519 "mask": "0x8", 00:05:58.519 "tpoint_mask": "0xffffffffffffffff" 00:05:58.519 }, 00:05:58.519 "nvmf_rdma": { 00:05:58.519 "mask": "0x10", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "nvmf_tcp": { 00:05:58.519 "mask": "0x20", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "ftl": { 00:05:58.519 "mask": "0x40", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "blobfs": { 00:05:58.519 "mask": "0x80", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "dsa": { 00:05:58.519 "mask": "0x200", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "thread": { 00:05:58.519 "mask": "0x400", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "nvme_pcie": { 00:05:58.519 "mask": "0x800", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "iaa": { 00:05:58.519 "mask": "0x1000", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "nvme_tcp": { 00:05:58.519 "mask": "0x2000", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 }, 00:05:58.519 "bdev_nvme": { 00:05:58.519 "mask": "0x4000", 00:05:58.519 "tpoint_mask": "0x0" 00:05:58.519 } 00:05:58.519 }' 00:05:58.519 17:49:51 -- rpc/rpc.sh@43 -- # jq length 00:05:58.519 17:49:51 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:58.519 17:49:51 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:58.519 17:49:51 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:58.519 17:49:51 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:58.519 17:49:51 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:58.519 17:49:51 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:58.519 17:49:51 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:58.519 17:49:51 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:58.519 17:49:51 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:58.519 00:05:58.519 real 0m0.215s 00:05:58.519 user 0m0.177s 00:05:58.519 sys 0m0.030s 00:05:58.519 17:49:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:58.519 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.519 ************************************ 00:05:58.519 END TEST rpc_trace_cmd_test 00:05:58.519 ************************************ 00:05:58.779 17:49:51 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:58.779 17:49:51 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:58.779 17:49:51 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:58.779 17:49:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:58.779 17:49:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:58.779 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.779 ************************************ 00:05:58.779 START TEST rpc_daemon_integrity 00:05:58.779 ************************************ 00:05:58.779 17:49:51 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:58.779 17:49:51 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:58.779 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.779 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.779 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.779 17:49:51 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:58.779 17:49:51 -- rpc/rpc.sh@13 -- # jq length 00:05:58.779 17:49:51 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:58.779 17:49:51 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:58.779 
17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.779 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.779 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.779 17:49:51 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:58.779 17:49:51 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:58.779 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.779 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.779 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.779 17:49:51 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:58.779 { 00:05:58.779 "name": "Malloc2", 00:05:58.779 "aliases": [ 00:05:58.779 "ec44c862-e97f-45c0-b739-0992cf899bb6" 00:05:58.779 ], 00:05:58.779 "product_name": "Malloc disk", 00:05:58.779 "block_size": 512, 00:05:58.779 "num_blocks": 16384, 00:05:58.779 "uuid": "ec44c862-e97f-45c0-b739-0992cf899bb6", 00:05:58.779 "assigned_rate_limits": { 00:05:58.779 "rw_ios_per_sec": 0, 00:05:58.779 "rw_mbytes_per_sec": 0, 00:05:58.779 "r_mbytes_per_sec": 0, 00:05:58.779 "w_mbytes_per_sec": 0 00:05:58.779 }, 00:05:58.779 "claimed": false, 00:05:58.779 "zoned": false, 00:05:58.779 "supported_io_types": { 00:05:58.779 "read": true, 00:05:58.779 "write": true, 00:05:58.779 "unmap": true, 00:05:58.779 "write_zeroes": true, 00:05:58.779 "flush": true, 00:05:58.779 "reset": true, 00:05:58.779 "compare": false, 00:05:58.779 "compare_and_write": false, 00:05:58.779 "abort": true, 00:05:58.779 "nvme_admin": false, 00:05:58.779 "nvme_io": false 00:05:58.779 }, 00:05:58.779 "memory_domains": [ 00:05:58.779 { 00:05:58.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:58.779 "dma_device_type": 2 00:05:58.779 } 00:05:58.779 ], 00:05:58.779 "driver_specific": {} 00:05:58.779 } 00:05:58.779 ]' 00:05:58.779 17:49:51 -- rpc/rpc.sh@17 -- # jq length 00:05:58.779 17:49:51 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:58.779 17:49:51 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:58.779 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.779 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.779 [2024-11-19 17:49:51.521043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:58.779 [2024-11-19 17:49:51.521075] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:58.779 [2024-11-19 17:49:51.521096] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4c214c0 00:05:58.779 [2024-11-19 17:49:51.521105] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:58.779 [2024-11-19 17:49:51.521803] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:58.779 [2024-11-19 17:49:51.521825] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:58.779 Passthru0 00:05:58.779 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.779 17:49:51 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:58.780 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.780 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.780 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.780 17:49:51 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:58.780 { 00:05:58.780 "name": "Malloc2", 00:05:58.780 "aliases": [ 00:05:58.780 "ec44c862-e97f-45c0-b739-0992cf899bb6" 00:05:58.780 ], 00:05:58.780 "product_name": "Malloc disk", 00:05:58.780 "block_size": 512, 00:05:58.780 "num_blocks": 16384, 
00:05:58.780 "uuid": "ec44c862-e97f-45c0-b739-0992cf899bb6", 00:05:58.780 "assigned_rate_limits": { 00:05:58.780 "rw_ios_per_sec": 0, 00:05:58.780 "rw_mbytes_per_sec": 0, 00:05:58.780 "r_mbytes_per_sec": 0, 00:05:58.780 "w_mbytes_per_sec": 0 00:05:58.780 }, 00:05:58.780 "claimed": true, 00:05:58.780 "claim_type": "exclusive_write", 00:05:58.780 "zoned": false, 00:05:58.780 "supported_io_types": { 00:05:58.780 "read": true, 00:05:58.780 "write": true, 00:05:58.780 "unmap": true, 00:05:58.780 "write_zeroes": true, 00:05:58.780 "flush": true, 00:05:58.780 "reset": true, 00:05:58.780 "compare": false, 00:05:58.780 "compare_and_write": false, 00:05:58.780 "abort": true, 00:05:58.780 "nvme_admin": false, 00:05:58.780 "nvme_io": false 00:05:58.780 }, 00:05:58.780 "memory_domains": [ 00:05:58.780 { 00:05:58.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:58.780 "dma_device_type": 2 00:05:58.780 } 00:05:58.780 ], 00:05:58.780 "driver_specific": {} 00:05:58.780 }, 00:05:58.780 { 00:05:58.780 "name": "Passthru0", 00:05:58.780 "aliases": [ 00:05:58.780 "1fba3f7f-1d73-5c00-a863-688dae84ae99" 00:05:58.780 ], 00:05:58.780 "product_name": "passthru", 00:05:58.780 "block_size": 512, 00:05:58.780 "num_blocks": 16384, 00:05:58.780 "uuid": "1fba3f7f-1d73-5c00-a863-688dae84ae99", 00:05:58.780 "assigned_rate_limits": { 00:05:58.780 "rw_ios_per_sec": 0, 00:05:58.780 "rw_mbytes_per_sec": 0, 00:05:58.780 "r_mbytes_per_sec": 0, 00:05:58.780 "w_mbytes_per_sec": 0 00:05:58.780 }, 00:05:58.780 "claimed": false, 00:05:58.780 "zoned": false, 00:05:58.780 "supported_io_types": { 00:05:58.780 "read": true, 00:05:58.780 "write": true, 00:05:58.780 "unmap": true, 00:05:58.780 "write_zeroes": true, 00:05:58.780 "flush": true, 00:05:58.780 "reset": true, 00:05:58.780 "compare": false, 00:05:58.780 "compare_and_write": false, 00:05:58.780 "abort": true, 00:05:58.780 "nvme_admin": false, 00:05:58.780 "nvme_io": false 00:05:58.780 }, 00:05:58.780 "memory_domains": [ 00:05:58.780 { 00:05:58.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:58.780 "dma_device_type": 2 00:05:58.780 } 00:05:58.780 ], 00:05:58.780 "driver_specific": { 00:05:58.780 "passthru": { 00:05:58.780 "name": "Passthru0", 00:05:58.780 "base_bdev_name": "Malloc2" 00:05:58.780 } 00:05:58.780 } 00:05:58.780 } 00:05:58.780 ]' 00:05:58.780 17:49:51 -- rpc/rpc.sh@21 -- # jq length 00:05:58.780 17:49:51 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:58.780 17:49:51 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:58.780 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.780 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.780 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.780 17:49:51 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:58.780 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.780 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.780 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.780 17:49:51 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:58.780 17:49:51 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.780 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:58.780 17:49:51 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.780 17:49:51 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:58.780 17:49:51 -- rpc/rpc.sh@26 -- # jq length 00:05:59.040 17:49:51 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:59.040 00:05:59.040 real 0m0.274s 00:05:59.040 user 0m0.175s 00:05:59.040 sys 0m0.041s 00:05:59.040 
17:49:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.040 17:49:51 -- common/autotest_common.sh@10 -- # set +x 00:05:59.040 ************************************ 00:05:59.040 END TEST rpc_daemon_integrity 00:05:59.040 ************************************ 00:05:59.040 17:49:51 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:59.040 17:49:51 -- rpc/rpc.sh@84 -- # killprocess 610843 00:05:59.040 17:49:51 -- common/autotest_common.sh@936 -- # '[' -z 610843 ']' 00:05:59.040 17:49:51 -- common/autotest_common.sh@940 -- # kill -0 610843 00:05:59.040 17:49:51 -- common/autotest_common.sh@941 -- # uname 00:05:59.040 17:49:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:59.040 17:49:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 610843 00:05:59.040 17:49:51 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:59.040 17:49:51 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:59.040 17:49:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 610843' 00:05:59.040 killing process with pid 610843 00:05:59.040 17:49:51 -- common/autotest_common.sh@955 -- # kill 610843 00:05:59.040 17:49:51 -- common/autotest_common.sh@960 -- # wait 610843 00:05:59.299 00:05:59.299 real 0m2.513s 00:05:59.299 user 0m3.133s 00:05:59.299 sys 0m0.781s 00:05:59.299 17:49:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.299 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:05:59.299 ************************************ 00:05:59.299 END TEST rpc 00:05:59.299 ************************************ 00:05:59.299 17:49:52 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:59.299 17:49:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:59.299 17:49:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:59.299 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:05:59.299 ************************************ 00:05:59.299 START TEST rpc_client 00:05:59.299 ************************************ 00:05:59.299 17:49:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:59.559 * Looking for test storage... 
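The killprocess trace above (autotest_common.sh@936-960) is the standard teardown: confirm the PID is alive, check that it is not a sudo wrapper, send SIGTERM, and reap it with wait. A condensed sketch of that sequence as traced; the sudo-wrapper branch is elided here:

```bash
# Condensed killprocess, mirroring the teardown traced above.
killprocess() {
  local pid=$1
  [ -n "$pid" ] || return 1
  kill -0 "$pid" || return 1                        # still running?
  if [ "$(uname)" = Linux ]; then
    process_name=$(ps --no-headers -o comm= "$pid") # reactor_0 in this run
  fi
  if [ "$process_name" = sudo ]; then
    : # the real helper resolves the sudo child here; elided in this sketch
  fi
  echo "killing process with pid $pid"
  kill "$pid"                                       # default SIGTERM
  wait "$pid"                                       # reap and propagate exit status
}
killprocess 610843
```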
00:05:59.559 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:59.559 17:49:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:59.559 17:49:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:59.559 17:49:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:59.559 17:49:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:59.559 17:49:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:59.559 17:49:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:59.559 17:49:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:59.559 17:49:52 -- scripts/common.sh@335 -- # IFS=.-: 00:05:59.559 17:49:52 -- scripts/common.sh@335 -- # read -ra ver1 00:05:59.559 17:49:52 -- scripts/common.sh@336 -- # IFS=.-: 00:05:59.559 17:49:52 -- scripts/common.sh@336 -- # read -ra ver2 00:05:59.559 17:49:52 -- scripts/common.sh@337 -- # local 'op=<' 00:05:59.559 17:49:52 -- scripts/common.sh@339 -- # ver1_l=2 00:05:59.559 17:49:52 -- scripts/common.sh@340 -- # ver2_l=1 00:05:59.559 17:49:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:59.559 17:49:52 -- scripts/common.sh@343 -- # case "$op" in 00:05:59.559 17:49:52 -- scripts/common.sh@344 -- # : 1 00:05:59.559 17:49:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:59.559 17:49:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:59.559 17:49:52 -- scripts/common.sh@364 -- # decimal 1 00:05:59.559 17:49:52 -- scripts/common.sh@352 -- # local d=1 00:05:59.560 17:49:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:59.560 17:49:52 -- scripts/common.sh@354 -- # echo 1 00:05:59.560 17:49:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:59.560 17:49:52 -- scripts/common.sh@365 -- # decimal 2 00:05:59.560 17:49:52 -- scripts/common.sh@352 -- # local d=2 00:05:59.560 17:49:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:59.560 17:49:52 -- scripts/common.sh@354 -- # echo 2 00:05:59.560 17:49:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:59.560 17:49:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:59.560 17:49:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:59.560 17:49:52 -- scripts/common.sh@367 -- # return 0 00:05:59.560 17:49:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:59.560 17:49:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:59.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.560 --rc genhtml_branch_coverage=1 00:05:59.560 --rc genhtml_function_coverage=1 00:05:59.560 --rc genhtml_legend=1 00:05:59.560 --rc geninfo_all_blocks=1 00:05:59.560 --rc geninfo_unexecuted_blocks=1 00:05:59.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.560 ' 00:05:59.560 17:49:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:59.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.560 --rc genhtml_branch_coverage=1 00:05:59.560 --rc genhtml_function_coverage=1 00:05:59.560 --rc genhtml_legend=1 00:05:59.560 --rc geninfo_all_blocks=1 00:05:59.560 --rc geninfo_unexecuted_blocks=1 00:05:59.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.560 ' 00:05:59.560 17:49:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:59.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.560 --rc genhtml_branch_coverage=1 
00:05:59.560 --rc genhtml_function_coverage=1 00:05:59.560 --rc genhtml_legend=1 00:05:59.560 --rc geninfo_all_blocks=1 00:05:59.560 --rc geninfo_unexecuted_blocks=1 00:05:59.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.560 ' 00:05:59.560 17:49:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:59.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.560 --rc genhtml_branch_coverage=1 00:05:59.560 --rc genhtml_function_coverage=1 00:05:59.560 --rc genhtml_legend=1 00:05:59.560 --rc geninfo_all_blocks=1 00:05:59.560 --rc geninfo_unexecuted_blocks=1 00:05:59.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.560 ' 00:05:59.560 17:49:52 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:59.560 OK 00:05:59.560 17:49:52 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:59.560 00:05:59.560 real 0m0.214s 00:05:59.560 user 0m0.109s 00:05:59.560 sys 0m0.123s 00:05:59.560 17:49:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.560 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:05:59.560 ************************************ 00:05:59.560 END TEST rpc_client 00:05:59.560 ************************************ 00:05:59.560 17:49:52 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:59.560 17:49:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:59.560 17:49:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:59.560 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:05:59.560 ************************************ 00:05:59.560 START TEST json_config 00:05:59.560 ************************************ 00:05:59.560 17:49:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:59.820 17:49:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:59.820 17:49:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:59.820 17:49:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:59.820 17:49:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:59.820 17:49:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:59.820 17:49:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:59.820 17:49:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:59.820 17:49:52 -- scripts/common.sh@335 -- # IFS=.-: 00:05:59.820 17:49:52 -- scripts/common.sh@335 -- # read -ra ver1 00:05:59.820 17:49:52 -- scripts/common.sh@336 -- # IFS=.-: 00:05:59.820 17:49:52 -- scripts/common.sh@336 -- # read -ra ver2 00:05:59.820 17:49:52 -- scripts/common.sh@337 -- # local 'op=<' 00:05:59.820 17:49:52 -- scripts/common.sh@339 -- # ver1_l=2 00:05:59.820 17:49:52 -- scripts/common.sh@340 -- # ver2_l=1 00:05:59.820 17:49:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:59.820 17:49:52 -- scripts/common.sh@343 -- # case "$op" in 00:05:59.820 17:49:52 -- scripts/common.sh@344 -- # : 1 00:05:59.820 17:49:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:59.820 17:49:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:59.820 17:49:52 -- scripts/common.sh@364 -- # decimal 1 00:05:59.820 17:49:52 -- scripts/common.sh@352 -- # local d=1 00:05:59.820 17:49:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:59.820 17:49:52 -- scripts/common.sh@354 -- # echo 1 00:05:59.820 17:49:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:59.820 17:49:52 -- scripts/common.sh@365 -- # decimal 2 00:05:59.820 17:49:52 -- scripts/common.sh@352 -- # local d=2 00:05:59.820 17:49:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:59.820 17:49:52 -- scripts/common.sh@354 -- # echo 2 00:05:59.820 17:49:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:59.820 17:49:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:59.820 17:49:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:59.820 17:49:52 -- scripts/common.sh@367 -- # return 0 00:05:59.820 17:49:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:59.820 17:49:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:59.820 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.820 --rc genhtml_branch_coverage=1 00:05:59.820 --rc genhtml_function_coverage=1 00:05:59.820 --rc genhtml_legend=1 00:05:59.820 --rc geninfo_all_blocks=1 00:05:59.820 --rc geninfo_unexecuted_blocks=1 00:05:59.820 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.820 ' 00:05:59.820 17:49:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:59.820 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.820 --rc genhtml_branch_coverage=1 00:05:59.820 --rc genhtml_function_coverage=1 00:05:59.820 --rc genhtml_legend=1 00:05:59.820 --rc geninfo_all_blocks=1 00:05:59.820 --rc geninfo_unexecuted_blocks=1 00:05:59.820 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.820 ' 00:05:59.820 17:49:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:59.820 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.820 --rc genhtml_branch_coverage=1 00:05:59.820 --rc genhtml_function_coverage=1 00:05:59.820 --rc genhtml_legend=1 00:05:59.820 --rc geninfo_all_blocks=1 00:05:59.820 --rc geninfo_unexecuted_blocks=1 00:05:59.820 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.820 ' 00:05:59.820 17:49:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:59.820 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.820 --rc genhtml_branch_coverage=1 00:05:59.820 --rc genhtml_function_coverage=1 00:05:59.820 --rc genhtml_legend=1 00:05:59.820 --rc geninfo_all_blocks=1 00:05:59.820 --rc geninfo_unexecuted_blocks=1 00:05:59.820 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.820 ' 00:05:59.820 17:49:52 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:59.820 17:49:52 -- nvmf/common.sh@7 -- # uname -s 00:05:59.820 17:49:52 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:59.820 17:49:52 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:59.820 17:49:52 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:59.820 17:49:52 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:59.820 17:49:52 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:59.820 17:49:52 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:59.820 17:49:52 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:59.820 17:49:52 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:59.820 17:49:52 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:59.820 17:49:52 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:59.821 17:49:52 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:59.821 17:49:52 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:59.821 17:49:52 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:59.821 17:49:52 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:59.821 17:49:52 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:59.821 17:49:52 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:59.821 17:49:52 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:59.821 17:49:52 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:59.821 17:49:52 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:59.821 17:49:52 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:59.821 17:49:52 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:59.821 17:49:52 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:59.821 17:49:52 -- paths/export.sh@5 -- # export PATH 00:05:59.821 17:49:52 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:59.821 17:49:52 -- nvmf/common.sh@46 -- # : 0 00:05:59.821 17:49:52 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:59.821 17:49:52 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:59.821 17:49:52 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:59.821 17:49:52 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:59.821 17:49:52 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:59.821 17:49:52 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:59.821 17:49:52 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:59.821 
17:49:52 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:59.821 17:49:52 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:59.821 17:49:52 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:59.821 17:49:52 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:59.821 17:49:52 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:59.821 17:49:52 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:59.821 WARNING: No tests are enabled so not running JSON configuration tests 00:05:59.821 17:49:52 -- json_config/json_config.sh@27 -- # exit 0 00:05:59.821 00:05:59.821 real 0m0.187s 00:05:59.821 user 0m0.123s 00:05:59.821 sys 0m0.073s 00:05:59.821 17:49:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.821 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:05:59.821 ************************************ 00:05:59.821 END TEST json_config 00:05:59.821 ************************************ 00:05:59.821 17:49:52 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:59.821 17:49:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:59.821 17:49:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:59.821 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:05:59.821 ************************************ 00:05:59.821 START TEST json_config_extra_key 00:05:59.821 ************************************ 00:05:59.821 17:49:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:00.081 17:49:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:00.082 17:49:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:00.082 17:49:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:00.082 17:49:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:00.082 17:49:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:00.082 17:49:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:00.082 17:49:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:00.082 17:49:52 -- scripts/common.sh@335 -- # IFS=.-: 00:06:00.082 17:49:52 -- scripts/common.sh@335 -- # read -ra ver1 00:06:00.082 17:49:52 -- scripts/common.sh@336 -- # IFS=.-: 00:06:00.082 17:49:52 -- scripts/common.sh@336 -- # read -ra ver2 00:06:00.082 17:49:52 -- scripts/common.sh@337 -- # local 'op=<' 00:06:00.082 17:49:52 -- scripts/common.sh@339 -- # ver1_l=2 00:06:00.082 17:49:52 -- scripts/common.sh@340 -- # ver2_l=1 00:06:00.082 17:49:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:00.082 17:49:52 -- scripts/common.sh@343 -- # case "$op" in 00:06:00.082 17:49:52 -- scripts/common.sh@344 -- # : 1 00:06:00.082 17:49:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:00.082 17:49:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
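The extra_key run that starts here launches spdk_tgt with a JSON configuration and later tears it down with SIGINT plus a bounded poll; the full trace follows below. A condensed sketch of that launch-and-teardown pattern, using the paths from this log (waitforlisten is the harness helper from autotest_common.sh, assumed to be sourced):

```bash
# Launch spdk_tgt with a JSON config, then shut it down the way
# json_config_extra_key.sh@40-62 does in the trace below.
SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK_ROOT/build/bin/spdk_tgt" -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json "$SPDK_ROOT/test/json_config/extra_key.json" &
pid=$!
waitforlisten "$pid" /var/tmp/spdk_tgt.sock      # poll until the RPC socket is up
kill -SIGINT "$pid"                              # ask the target to shut down
for (( i = 0; i < 30; i++ )); do                 # bounded wait, 0.5 s steps
  kill -0 "$pid" 2>/dev/null || { echo 'SPDK target shutdown done'; break; }
  sleep 0.5
done
```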
ver1_l : ver2_l) )) 00:06:00.082 17:49:52 -- scripts/common.sh@364 -- # decimal 1 00:06:00.082 17:49:52 -- scripts/common.sh@352 -- # local d=1 00:06:00.082 17:49:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:00.082 17:49:52 -- scripts/common.sh@354 -- # echo 1 00:06:00.082 17:49:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:00.082 17:49:52 -- scripts/common.sh@365 -- # decimal 2 00:06:00.082 17:49:52 -- scripts/common.sh@352 -- # local d=2 00:06:00.082 17:49:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:00.082 17:49:52 -- scripts/common.sh@354 -- # echo 2 00:06:00.082 17:49:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:00.082 17:49:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:00.082 17:49:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:00.082 17:49:52 -- scripts/common.sh@367 -- # return 0 00:06:00.082 17:49:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:00.082 17:49:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:00.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.082 --rc genhtml_branch_coverage=1 00:06:00.082 --rc genhtml_function_coverage=1 00:06:00.082 --rc genhtml_legend=1 00:06:00.082 --rc geninfo_all_blocks=1 00:06:00.082 --rc geninfo_unexecuted_blocks=1 00:06:00.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.082 ' 00:06:00.082 17:49:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:00.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.082 --rc genhtml_branch_coverage=1 00:06:00.082 --rc genhtml_function_coverage=1 00:06:00.082 --rc genhtml_legend=1 00:06:00.082 --rc geninfo_all_blocks=1 00:06:00.082 --rc geninfo_unexecuted_blocks=1 00:06:00.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.082 ' 00:06:00.082 17:49:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:00.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.082 --rc genhtml_branch_coverage=1 00:06:00.082 --rc genhtml_function_coverage=1 00:06:00.082 --rc genhtml_legend=1 00:06:00.082 --rc geninfo_all_blocks=1 00:06:00.082 --rc geninfo_unexecuted_blocks=1 00:06:00.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.082 ' 00:06:00.082 17:49:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:00.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.082 --rc genhtml_branch_coverage=1 00:06:00.082 --rc genhtml_function_coverage=1 00:06:00.082 --rc genhtml_legend=1 00:06:00.082 --rc geninfo_all_blocks=1 00:06:00.082 --rc geninfo_unexecuted_blocks=1 00:06:00.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.082 ' 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:00.082 17:49:52 -- nvmf/common.sh@7 -- # uname -s 00:06:00.082 17:49:52 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:00.082 17:49:52 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:00.082 17:49:52 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:00.082 17:49:52 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:00.082 17:49:52 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:00.082 17:49:52 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:00.082 17:49:52 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:00.082 17:49:52 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:00.082 17:49:52 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:00.082 17:49:52 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:00.082 17:49:52 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:00.082 17:49:52 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:00.082 17:49:52 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:00.082 17:49:52 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:00.082 17:49:52 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:00.082 17:49:52 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:00.082 17:49:52 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:00.082 17:49:52 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:00.082 17:49:52 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:00.082 17:49:52 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.082 17:49:52 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.082 17:49:52 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.082 17:49:52 -- paths/export.sh@5 -- # export PATH 00:06:00.082 17:49:52 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.082 17:49:52 -- nvmf/common.sh@46 -- # : 0 00:06:00.082 17:49:52 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:06:00.082 17:49:52 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:06:00.082 17:49:52 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:06:00.082 17:49:52 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:00.082 17:49:52 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:00.082 17:49:52 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:06:00.082 17:49:52 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:06:00.082 
17:49:52 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:06:00.082 INFO: launching applications... 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@25 -- # shift 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=611641 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:06:00.082 Waiting for target to run... 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 611641 /var/tmp/spdk_tgt.sock 00:06:00.082 17:49:52 -- common/autotest_common.sh@829 -- # '[' -z 611641 ']' 00:06:00.082 17:49:52 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:00.082 17:49:52 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:00.082 17:49:52 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:00.082 17:49:52 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:00.082 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:00.082 17:49:52 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:00.082 17:49:52 -- common/autotest_common.sh@10 -- # set +x 00:06:00.082 [2024-11-19 17:49:52.832645] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
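For reference, the json_config_extra_key steps above launch spdk_tgt from a JSON config and block until the target answers on its UNIX RPC socket. A minimal sketch of that start-and-wait pattern, with placeholder paths; the real logic lives in the waitforlisten helper referenced in the trace:

    # Rough sketch of launching spdk_tgt from a JSON config and polling its
    # RPC socket until it is ready, as waitforlisten does above.
    # Paths and the config name are illustrative placeholders.
    SPDK_BIN=./build/bin/spdk_tgt
    RPC_SOCK=/var/tmp/spdk_tgt.sock

    "$SPDK_BIN" -m 0x1 -s 1024 -r "$RPC_SOCK" --json extra_key.json &
    tgt_pid=$!

    # Up to 30 probes with a 0.5 s pause, mirroring the (( i < 30 )) /
    # sleep 0.5 loop in the log; spdk_get_version is a cheap readiness RPC.
    for i in $(seq 1 30); do
        if ./scripts/rpc.py -s "$RPC_SOCK" -t 1 spdk_get_version >/dev/null 2>&1; then
            echo "target up (pid $tgt_pid)"
            break
        fi
        sleep 0.5
    done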
00:06:00.082 [2024-11-19 17:49:52.832735] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid611641 ] 00:06:00.082 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.650 [2024-11-19 17:49:53.284260] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.650 [2024-11-19 17:49:53.312685] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:00.650 [2024-11-19 17:49:53.312786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.909 17:49:53 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:00.909 17:49:53 -- common/autotest_common.sh@862 -- # return 0 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:06:00.909 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:06:00.909 INFO: shutting down applications... 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 611641 ]] 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 611641 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@50 -- # kill -0 611641 00:06:00.909 17:49:53 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@50 -- # kill -0 611641 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@52 -- # break 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:06:01.478 SPDK target shutdown done 00:06:01.478 17:49:54 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:06:01.478 Success 00:06:01.478 00:06:01.478 real 0m1.566s 00:06:01.478 user 0m1.141s 00:06:01.478 sys 0m0.579s 00:06:01.478 17:49:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:01.478 17:49:54 -- common/autotest_common.sh@10 -- # set +x 00:06:01.478 ************************************ 00:06:01.478 END TEST json_config_extra_key 00:06:01.478 ************************************ 00:06:01.478 17:49:54 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:01.478 17:49:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:01.479 17:49:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.479 17:49:54 -- common/autotest_common.sh@10 -- # set +x 00:06:01.479 ************************************ 00:06:01.479 START TEST alias_rpc 00:06:01.479 ************************************ 00:06:01.479 17:49:54 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:01.479 * Looking for test storage... 00:06:01.479 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:01.479 17:49:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:01.479 17:49:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:01.479 17:49:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:01.739 17:49:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:01.739 17:49:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:01.739 17:49:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:01.739 17:49:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:01.739 17:49:54 -- scripts/common.sh@335 -- # IFS=.-: 00:06:01.739 17:49:54 -- scripts/common.sh@335 -- # read -ra ver1 00:06:01.739 17:49:54 -- scripts/common.sh@336 -- # IFS=.-: 00:06:01.739 17:49:54 -- scripts/common.sh@336 -- # read -ra ver2 00:06:01.739 17:49:54 -- scripts/common.sh@337 -- # local 'op=<' 00:06:01.739 17:49:54 -- scripts/common.sh@339 -- # ver1_l=2 00:06:01.739 17:49:54 -- scripts/common.sh@340 -- # ver2_l=1 00:06:01.739 17:49:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:01.739 17:49:54 -- scripts/common.sh@343 -- # case "$op" in 00:06:01.739 17:49:54 -- scripts/common.sh@344 -- # : 1 00:06:01.739 17:49:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:01.739 17:49:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:01.739 17:49:54 -- scripts/common.sh@364 -- # decimal 1 00:06:01.739 17:49:54 -- scripts/common.sh@352 -- # local d=1 00:06:01.739 17:49:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:01.739 17:49:54 -- scripts/common.sh@354 -- # echo 1 00:06:01.739 17:49:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:01.739 17:49:54 -- scripts/common.sh@365 -- # decimal 2 00:06:01.739 17:49:54 -- scripts/common.sh@352 -- # local d=2 00:06:01.739 17:49:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:01.739 17:49:54 -- scripts/common.sh@354 -- # echo 2 00:06:01.739 17:49:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:01.739 17:49:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:01.739 17:49:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:01.739 17:49:54 -- scripts/common.sh@367 -- # return 0 00:06:01.739 17:49:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:01.739 17:49:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:01.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.739 --rc genhtml_branch_coverage=1 00:06:01.739 --rc genhtml_function_coverage=1 00:06:01.739 --rc genhtml_legend=1 00:06:01.739 --rc geninfo_all_blocks=1 00:06:01.739 --rc geninfo_unexecuted_blocks=1 00:06:01.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.739 ' 00:06:01.739 17:49:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:01.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.739 --rc genhtml_branch_coverage=1 00:06:01.739 --rc genhtml_function_coverage=1 00:06:01.739 --rc genhtml_legend=1 00:06:01.739 --rc geninfo_all_blocks=1 00:06:01.739 --rc geninfo_unexecuted_blocks=1 00:06:01.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.739 ' 00:06:01.739 
17:49:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:01.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.739 --rc genhtml_branch_coverage=1 00:06:01.739 --rc genhtml_function_coverage=1 00:06:01.739 --rc genhtml_legend=1 00:06:01.739 --rc geninfo_all_blocks=1 00:06:01.739 --rc geninfo_unexecuted_blocks=1 00:06:01.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.739 ' 00:06:01.739 17:49:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:01.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.739 --rc genhtml_branch_coverage=1 00:06:01.739 --rc genhtml_function_coverage=1 00:06:01.739 --rc genhtml_legend=1 00:06:01.739 --rc geninfo_all_blocks=1 00:06:01.739 --rc geninfo_unexecuted_blocks=1 00:06:01.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.739 ' 00:06:01.739 17:49:54 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:01.739 17:49:54 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=611971 00:06:01.739 17:49:54 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:01.739 17:49:54 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 611971 00:06:01.739 17:49:54 -- common/autotest_common.sh@829 -- # '[' -z 611971 ']' 00:06:01.739 17:49:54 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.739 17:49:54 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:01.739 17:49:54 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.739 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.739 17:49:54 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:01.739 17:49:54 -- common/autotest_common.sh@10 -- # set +x 00:06:01.739 [2024-11-19 17:49:54.448978] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:01.739 [2024-11-19 17:49:54.449053] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid611971 ] 00:06:01.739 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.739 [2024-11-19 17:49:54.529008] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.739 [2024-11-19 17:49:54.564279] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:01.739 [2024-11-19 17:49:54.564414] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.678 17:49:55 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:02.678 17:49:55 -- common/autotest_common.sh@862 -- # return 0 00:06:02.678 17:49:55 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:02.678 17:49:55 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 611971 00:06:02.678 17:49:55 -- common/autotest_common.sh@936 -- # '[' -z 611971 ']' 00:06:02.678 17:49:55 -- common/autotest_common.sh@940 -- # kill -0 611971 00:06:02.678 17:49:55 -- common/autotest_common.sh@941 -- # uname 00:06:02.678 17:49:55 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:02.678 17:49:55 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 611971 00:06:02.937 17:49:55 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:02.937 17:49:55 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:02.937 17:49:55 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 611971' 00:06:02.937 killing process with pid 611971 00:06:02.937 17:49:55 -- common/autotest_common.sh@955 -- # kill 611971 00:06:02.937 17:49:55 -- common/autotest_common.sh@960 -- # wait 611971 00:06:03.195 00:06:03.195 real 0m1.614s 00:06:03.195 user 0m1.737s 00:06:03.195 sys 0m0.478s 00:06:03.195 17:49:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.195 17:49:55 -- common/autotest_common.sh@10 -- # set +x 00:06:03.195 ************************************ 00:06:03.195 END TEST alias_rpc 00:06:03.195 ************************************ 00:06:03.195 17:49:55 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:06:03.195 17:49:55 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:03.195 17:49:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:03.195 17:49:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.195 17:49:55 -- common/autotest_common.sh@10 -- # set +x 00:06:03.195 ************************************ 00:06:03.195 START TEST spdkcli_tcp 00:06:03.195 ************************************ 00:06:03.195 17:49:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:03.195 * Looking for test storage... 
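The alias_rpc section that just finished follows the same start/stop template; its distinctive step is replaying a saved configuration over stdin with load_config -i, which only succeeds if deprecated RPC aliases are still accepted. Roughly, with a placeholder config file:

    # Replay a JSON-RPC config over stdin (alias_rpc.sh@17 above);
    # config.json is a placeholder name for the saved configuration.
    ./scripts/rpc.py -s /var/tmp/spdk.sock load_config -i < config.json

    # Teardown as in the killprocess helper seen in the trace:
    # signal the target, then reap it.
    kill "$tgt_pid" && wait "$tgt_pid"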
00:06:03.195 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:03.195 17:49:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:03.195 17:49:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:03.195 17:49:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:03.454 17:49:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:03.454 17:49:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:03.454 17:49:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:03.454 17:49:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:03.454 17:49:56 -- scripts/common.sh@335 -- # IFS=.-: 00:06:03.454 17:49:56 -- scripts/common.sh@335 -- # read -ra ver1 00:06:03.454 17:49:56 -- scripts/common.sh@336 -- # IFS=.-: 00:06:03.454 17:49:56 -- scripts/common.sh@336 -- # read -ra ver2 00:06:03.454 17:49:56 -- scripts/common.sh@337 -- # local 'op=<' 00:06:03.454 17:49:56 -- scripts/common.sh@339 -- # ver1_l=2 00:06:03.454 17:49:56 -- scripts/common.sh@340 -- # ver2_l=1 00:06:03.454 17:49:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:03.454 17:49:56 -- scripts/common.sh@343 -- # case "$op" in 00:06:03.454 17:49:56 -- scripts/common.sh@344 -- # : 1 00:06:03.454 17:49:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:03.454 17:49:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:03.454 17:49:56 -- scripts/common.sh@364 -- # decimal 1 00:06:03.454 17:49:56 -- scripts/common.sh@352 -- # local d=1 00:06:03.454 17:49:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:03.454 17:49:56 -- scripts/common.sh@354 -- # echo 1 00:06:03.454 17:49:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:03.454 17:49:56 -- scripts/common.sh@365 -- # decimal 2 00:06:03.454 17:49:56 -- scripts/common.sh@352 -- # local d=2 00:06:03.454 17:49:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:03.454 17:49:56 -- scripts/common.sh@354 -- # echo 2 00:06:03.454 17:49:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:03.454 17:49:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:03.454 17:49:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:03.454 17:49:56 -- scripts/common.sh@367 -- # return 0 00:06:03.454 17:49:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:03.454 17:49:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:03.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.454 --rc genhtml_branch_coverage=1 00:06:03.454 --rc genhtml_function_coverage=1 00:06:03.454 --rc genhtml_legend=1 00:06:03.454 --rc geninfo_all_blocks=1 00:06:03.454 --rc geninfo_unexecuted_blocks=1 00:06:03.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.454 ' 00:06:03.454 17:49:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:03.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.454 --rc genhtml_branch_coverage=1 00:06:03.454 --rc genhtml_function_coverage=1 00:06:03.454 --rc genhtml_legend=1 00:06:03.454 --rc geninfo_all_blocks=1 00:06:03.454 --rc geninfo_unexecuted_blocks=1 00:06:03.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.454 ' 00:06:03.454 17:49:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:03.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.454 --rc genhtml_branch_coverage=1 
00:06:03.454 --rc genhtml_function_coverage=1 00:06:03.454 --rc genhtml_legend=1 00:06:03.454 --rc geninfo_all_blocks=1 00:06:03.454 --rc geninfo_unexecuted_blocks=1 00:06:03.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.454 ' 00:06:03.454 17:49:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:03.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.454 --rc genhtml_branch_coverage=1 00:06:03.454 --rc genhtml_function_coverage=1 00:06:03.454 --rc genhtml_legend=1 00:06:03.454 --rc geninfo_all_blocks=1 00:06:03.454 --rc geninfo_unexecuted_blocks=1 00:06:03.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.454 ' 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:03.454 17:49:56 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:03.454 17:49:56 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:03.454 17:49:56 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:03.454 17:49:56 -- common/autotest_common.sh@10 -- # set +x 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=612304 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:03.454 17:49:56 -- spdkcli/tcp.sh@27 -- # waitforlisten 612304 00:06:03.454 17:49:56 -- common/autotest_common.sh@829 -- # '[' -z 612304 ']' 00:06:03.454 17:49:56 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.454 17:49:56 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:03.454 17:49:56 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.454 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:03.454 17:49:56 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:03.454 17:49:56 -- common/autotest_common.sh@10 -- # set +x 00:06:03.454 [2024-11-19 17:49:56.117053] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
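The lcov probe that precedes every test here ("lt 1.15 2" through cmp_versions in scripts/common.sh) splits both version strings on ".-:", walks the fields left to right, and decides at the first unequal pair. A condensed sketch of the same comparison, leaving out the non-numeric guards the real helper applies:

    # Return 0 (true) when dotted version $1 sorts before $2.
    version_lt() {
        local IFS=.-:
        local -a a=($1) b=($2)
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for ((i = 0; i < n; i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1    # equal is not less-than
    }

    version_lt 1.15 2 && echo "lcov 1.15 predates 2.x"   # matches the trace above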
00:06:03.454 [2024-11-19 17:49:56.117119] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid612304 ] 00:06:03.454 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.454 [2024-11-19 17:49:56.197161] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:03.454 [2024-11-19 17:49:56.233365] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:03.454 [2024-11-19 17:49:56.233595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.454 [2024-11-19 17:49:56.233595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:04.391 17:49:56 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.391 17:49:56 -- common/autotest_common.sh@862 -- # return 0 00:06:04.391 17:49:56 -- spdkcli/tcp.sh@31 -- # socat_pid=612528 00:06:04.391 17:49:56 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:04.391 17:49:56 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:04.391 [ 00:06:04.391 "spdk_get_version", 00:06:04.391 "rpc_get_methods", 00:06:04.391 "trace_get_info", 00:06:04.391 "trace_get_tpoint_group_mask", 00:06:04.391 "trace_disable_tpoint_group", 00:06:04.391 "trace_enable_tpoint_group", 00:06:04.391 "trace_clear_tpoint_mask", 00:06:04.391 "trace_set_tpoint_mask", 00:06:04.391 "vfu_tgt_set_base_path", 00:06:04.391 "framework_get_pci_devices", 00:06:04.391 "framework_get_config", 00:06:04.391 "framework_get_subsystems", 00:06:04.391 "iobuf_get_stats", 00:06:04.391 "iobuf_set_options", 00:06:04.391 "sock_set_default_impl", 00:06:04.391 "sock_impl_set_options", 00:06:04.391 "sock_impl_get_options", 00:06:04.391 "vmd_rescan", 00:06:04.391 "vmd_remove_device", 00:06:04.391 "vmd_enable", 00:06:04.391 "accel_get_stats", 00:06:04.391 "accel_set_options", 00:06:04.391 "accel_set_driver", 00:06:04.391 "accel_crypto_key_destroy", 00:06:04.391 "accel_crypto_keys_get", 00:06:04.391 "accel_crypto_key_create", 00:06:04.391 "accel_assign_opc", 00:06:04.391 "accel_get_module_info", 00:06:04.391 "accel_get_opc_assignments", 00:06:04.391 "notify_get_notifications", 00:06:04.391 "notify_get_types", 00:06:04.391 "bdev_get_histogram", 00:06:04.391 "bdev_enable_histogram", 00:06:04.391 "bdev_set_qos_limit", 00:06:04.391 "bdev_set_qd_sampling_period", 00:06:04.391 "bdev_get_bdevs", 00:06:04.391 "bdev_reset_iostat", 00:06:04.391 "bdev_get_iostat", 00:06:04.391 "bdev_examine", 00:06:04.391 "bdev_wait_for_examine", 00:06:04.391 "bdev_set_options", 00:06:04.391 "scsi_get_devices", 00:06:04.391 "thread_set_cpumask", 00:06:04.391 "framework_get_scheduler", 00:06:04.391 "framework_set_scheduler", 00:06:04.391 "framework_get_reactors", 00:06:04.391 "thread_get_io_channels", 00:06:04.391 "thread_get_pollers", 00:06:04.391 "thread_get_stats", 00:06:04.391 "framework_monitor_context_switch", 00:06:04.391 "spdk_kill_instance", 00:06:04.391 "log_enable_timestamps", 00:06:04.391 "log_get_flags", 00:06:04.391 "log_clear_flag", 00:06:04.391 "log_set_flag", 00:06:04.391 "log_get_level", 00:06:04.391 "log_set_level", 00:06:04.391 "log_get_print_level", 00:06:04.391 "log_set_print_level", 00:06:04.391 "framework_enable_cpumask_locks", 00:06:04.391 "framework_disable_cpumask_locks", 00:06:04.391 "framework_wait_init", 00:06:04.391 
"framework_start_init", 00:06:04.391 "virtio_blk_create_transport", 00:06:04.391 "virtio_blk_get_transports", 00:06:04.391 "vhost_controller_set_coalescing", 00:06:04.391 "vhost_get_controllers", 00:06:04.391 "vhost_delete_controller", 00:06:04.391 "vhost_create_blk_controller", 00:06:04.391 "vhost_scsi_controller_remove_target", 00:06:04.391 "vhost_scsi_controller_add_target", 00:06:04.391 "vhost_start_scsi_controller", 00:06:04.391 "vhost_create_scsi_controller", 00:06:04.391 "ublk_recover_disk", 00:06:04.391 "ublk_get_disks", 00:06:04.391 "ublk_stop_disk", 00:06:04.391 "ublk_start_disk", 00:06:04.391 "ublk_destroy_target", 00:06:04.391 "ublk_create_target", 00:06:04.391 "nbd_get_disks", 00:06:04.391 "nbd_stop_disk", 00:06:04.391 "nbd_start_disk", 00:06:04.391 "env_dpdk_get_mem_stats", 00:06:04.391 "nvmf_subsystem_get_listeners", 00:06:04.391 "nvmf_subsystem_get_qpairs", 00:06:04.391 "nvmf_subsystem_get_controllers", 00:06:04.391 "nvmf_get_stats", 00:06:04.391 "nvmf_get_transports", 00:06:04.391 "nvmf_create_transport", 00:06:04.391 "nvmf_get_targets", 00:06:04.391 "nvmf_delete_target", 00:06:04.391 "nvmf_create_target", 00:06:04.391 "nvmf_subsystem_allow_any_host", 00:06:04.391 "nvmf_subsystem_remove_host", 00:06:04.391 "nvmf_subsystem_add_host", 00:06:04.391 "nvmf_subsystem_remove_ns", 00:06:04.391 "nvmf_subsystem_add_ns", 00:06:04.391 "nvmf_subsystem_listener_set_ana_state", 00:06:04.391 "nvmf_discovery_get_referrals", 00:06:04.391 "nvmf_discovery_remove_referral", 00:06:04.391 "nvmf_discovery_add_referral", 00:06:04.391 "nvmf_subsystem_remove_listener", 00:06:04.391 "nvmf_subsystem_add_listener", 00:06:04.391 "nvmf_delete_subsystem", 00:06:04.391 "nvmf_create_subsystem", 00:06:04.391 "nvmf_get_subsystems", 00:06:04.391 "nvmf_set_crdt", 00:06:04.391 "nvmf_set_config", 00:06:04.391 "nvmf_set_max_subsystems", 00:06:04.391 "iscsi_set_options", 00:06:04.391 "iscsi_get_auth_groups", 00:06:04.391 "iscsi_auth_group_remove_secret", 00:06:04.392 "iscsi_auth_group_add_secret", 00:06:04.392 "iscsi_delete_auth_group", 00:06:04.392 "iscsi_create_auth_group", 00:06:04.392 "iscsi_set_discovery_auth", 00:06:04.392 "iscsi_get_options", 00:06:04.392 "iscsi_target_node_request_logout", 00:06:04.392 "iscsi_target_node_set_redirect", 00:06:04.392 "iscsi_target_node_set_auth", 00:06:04.392 "iscsi_target_node_add_lun", 00:06:04.392 "iscsi_get_connections", 00:06:04.392 "iscsi_portal_group_set_auth", 00:06:04.392 "iscsi_start_portal_group", 00:06:04.392 "iscsi_delete_portal_group", 00:06:04.392 "iscsi_create_portal_group", 00:06:04.392 "iscsi_get_portal_groups", 00:06:04.392 "iscsi_delete_target_node", 00:06:04.392 "iscsi_target_node_remove_pg_ig_maps", 00:06:04.392 "iscsi_target_node_add_pg_ig_maps", 00:06:04.392 "iscsi_create_target_node", 00:06:04.392 "iscsi_get_target_nodes", 00:06:04.392 "iscsi_delete_initiator_group", 00:06:04.392 "iscsi_initiator_group_remove_initiators", 00:06:04.392 "iscsi_initiator_group_add_initiators", 00:06:04.392 "iscsi_create_initiator_group", 00:06:04.392 "iscsi_get_initiator_groups", 00:06:04.392 "vfu_virtio_create_scsi_endpoint", 00:06:04.392 "vfu_virtio_scsi_remove_target", 00:06:04.392 "vfu_virtio_scsi_add_target", 00:06:04.392 "vfu_virtio_create_blk_endpoint", 00:06:04.392 "vfu_virtio_delete_endpoint", 00:06:04.392 "iaa_scan_accel_module", 00:06:04.392 "dsa_scan_accel_module", 00:06:04.392 "ioat_scan_accel_module", 00:06:04.392 "accel_error_inject_error", 00:06:04.392 "bdev_iscsi_delete", 00:06:04.392 "bdev_iscsi_create", 00:06:04.392 "bdev_iscsi_set_options", 
00:06:04.392 "bdev_virtio_attach_controller", 00:06:04.392 "bdev_virtio_scsi_get_devices", 00:06:04.392 "bdev_virtio_detach_controller", 00:06:04.392 "bdev_virtio_blk_set_hotplug", 00:06:04.392 "bdev_ftl_set_property", 00:06:04.392 "bdev_ftl_get_properties", 00:06:04.392 "bdev_ftl_get_stats", 00:06:04.392 "bdev_ftl_unmap", 00:06:04.392 "bdev_ftl_unload", 00:06:04.392 "bdev_ftl_delete", 00:06:04.392 "bdev_ftl_load", 00:06:04.392 "bdev_ftl_create", 00:06:04.392 "bdev_aio_delete", 00:06:04.392 "bdev_aio_rescan", 00:06:04.392 "bdev_aio_create", 00:06:04.392 "blobfs_create", 00:06:04.392 "blobfs_detect", 00:06:04.392 "blobfs_set_cache_size", 00:06:04.392 "bdev_zone_block_delete", 00:06:04.392 "bdev_zone_block_create", 00:06:04.392 "bdev_delay_delete", 00:06:04.392 "bdev_delay_create", 00:06:04.392 "bdev_delay_update_latency", 00:06:04.392 "bdev_split_delete", 00:06:04.392 "bdev_split_create", 00:06:04.392 "bdev_error_inject_error", 00:06:04.392 "bdev_error_delete", 00:06:04.392 "bdev_error_create", 00:06:04.392 "bdev_raid_set_options", 00:06:04.392 "bdev_raid_remove_base_bdev", 00:06:04.392 "bdev_raid_add_base_bdev", 00:06:04.392 "bdev_raid_delete", 00:06:04.392 "bdev_raid_create", 00:06:04.392 "bdev_raid_get_bdevs", 00:06:04.392 "bdev_lvol_grow_lvstore", 00:06:04.392 "bdev_lvol_get_lvols", 00:06:04.392 "bdev_lvol_get_lvstores", 00:06:04.392 "bdev_lvol_delete", 00:06:04.392 "bdev_lvol_set_read_only", 00:06:04.392 "bdev_lvol_resize", 00:06:04.392 "bdev_lvol_decouple_parent", 00:06:04.392 "bdev_lvol_inflate", 00:06:04.392 "bdev_lvol_rename", 00:06:04.392 "bdev_lvol_clone_bdev", 00:06:04.392 "bdev_lvol_clone", 00:06:04.392 "bdev_lvol_snapshot", 00:06:04.392 "bdev_lvol_create", 00:06:04.392 "bdev_lvol_delete_lvstore", 00:06:04.392 "bdev_lvol_rename_lvstore", 00:06:04.392 "bdev_lvol_create_lvstore", 00:06:04.392 "bdev_passthru_delete", 00:06:04.392 "bdev_passthru_create", 00:06:04.392 "bdev_nvme_cuse_unregister", 00:06:04.392 "bdev_nvme_cuse_register", 00:06:04.392 "bdev_opal_new_user", 00:06:04.392 "bdev_opal_set_lock_state", 00:06:04.392 "bdev_opal_delete", 00:06:04.392 "bdev_opal_get_info", 00:06:04.392 "bdev_opal_create", 00:06:04.392 "bdev_nvme_opal_revert", 00:06:04.392 "bdev_nvme_opal_init", 00:06:04.392 "bdev_nvme_send_cmd", 00:06:04.392 "bdev_nvme_get_path_iostat", 00:06:04.392 "bdev_nvme_get_mdns_discovery_info", 00:06:04.392 "bdev_nvme_stop_mdns_discovery", 00:06:04.392 "bdev_nvme_start_mdns_discovery", 00:06:04.392 "bdev_nvme_set_multipath_policy", 00:06:04.392 "bdev_nvme_set_preferred_path", 00:06:04.392 "bdev_nvme_get_io_paths", 00:06:04.392 "bdev_nvme_remove_error_injection", 00:06:04.392 "bdev_nvme_add_error_injection", 00:06:04.392 "bdev_nvme_get_discovery_info", 00:06:04.392 "bdev_nvme_stop_discovery", 00:06:04.392 "bdev_nvme_start_discovery", 00:06:04.392 "bdev_nvme_get_controller_health_info", 00:06:04.392 "bdev_nvme_disable_controller", 00:06:04.392 "bdev_nvme_enable_controller", 00:06:04.392 "bdev_nvme_reset_controller", 00:06:04.392 "bdev_nvme_get_transport_statistics", 00:06:04.392 "bdev_nvme_apply_firmware", 00:06:04.392 "bdev_nvme_detach_controller", 00:06:04.392 "bdev_nvme_get_controllers", 00:06:04.392 "bdev_nvme_attach_controller", 00:06:04.392 "bdev_nvme_set_hotplug", 00:06:04.392 "bdev_nvme_set_options", 00:06:04.392 "bdev_null_resize", 00:06:04.392 "bdev_null_delete", 00:06:04.392 "bdev_null_create", 00:06:04.392 "bdev_malloc_delete", 00:06:04.392 "bdev_malloc_create" 00:06:04.392 ] 00:06:04.392 17:49:57 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:06:04.392 17:49:57 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:04.392 17:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:04.392 17:49:57 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:04.392 17:49:57 -- spdkcli/tcp.sh@38 -- # killprocess 612304 00:06:04.392 17:49:57 -- common/autotest_common.sh@936 -- # '[' -z 612304 ']' 00:06:04.392 17:49:57 -- common/autotest_common.sh@940 -- # kill -0 612304 00:06:04.392 17:49:57 -- common/autotest_common.sh@941 -- # uname 00:06:04.392 17:49:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:04.392 17:49:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 612304 00:06:04.392 17:49:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:04.392 17:49:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:04.392 17:49:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 612304' 00:06:04.392 killing process with pid 612304 00:06:04.392 17:49:57 -- common/autotest_common.sh@955 -- # kill 612304 00:06:04.392 17:49:57 -- common/autotest_common.sh@960 -- # wait 612304 00:06:04.962 00:06:04.962 real 0m1.634s 00:06:04.962 user 0m2.995s 00:06:04.962 sys 0m0.515s 00:06:04.962 17:49:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:04.962 17:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:04.962 ************************************ 00:06:04.962 END TEST spdkcli_tcp 00:06:04.962 ************************************ 00:06:04.962 17:49:57 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:04.962 17:49:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:04.962 17:49:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:04.962 17:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:04.962 ************************************ 00:06:04.962 START TEST dpdk_mem_utility 00:06:04.962 ************************************ 00:06:04.962 17:49:57 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:04.962 * Looking for test storage... 
00:06:04.962 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:04.962 17:49:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:04.962 17:49:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:04.962 17:49:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:04.962 17:49:57 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:04.962 17:49:57 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:04.962 17:49:57 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:04.962 17:49:57 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:04.962 17:49:57 -- scripts/common.sh@335 -- # IFS=.-: 00:06:04.962 17:49:57 -- scripts/common.sh@335 -- # read -ra ver1 00:06:04.962 17:49:57 -- scripts/common.sh@336 -- # IFS=.-: 00:06:04.962 17:49:57 -- scripts/common.sh@336 -- # read -ra ver2 00:06:04.962 17:49:57 -- scripts/common.sh@337 -- # local 'op=<' 00:06:04.962 17:49:57 -- scripts/common.sh@339 -- # ver1_l=2 00:06:04.962 17:49:57 -- scripts/common.sh@340 -- # ver2_l=1 00:06:04.962 17:49:57 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:04.962 17:49:57 -- scripts/common.sh@343 -- # case "$op" in 00:06:04.962 17:49:57 -- scripts/common.sh@344 -- # : 1 00:06:04.962 17:49:57 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:04.962 17:49:57 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:04.962 17:49:57 -- scripts/common.sh@364 -- # decimal 1 00:06:04.962 17:49:57 -- scripts/common.sh@352 -- # local d=1 00:06:04.962 17:49:57 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:04.962 17:49:57 -- scripts/common.sh@354 -- # echo 1 00:06:04.962 17:49:57 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:04.962 17:49:57 -- scripts/common.sh@365 -- # decimal 2 00:06:04.962 17:49:57 -- scripts/common.sh@352 -- # local d=2 00:06:04.962 17:49:57 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:04.962 17:49:57 -- scripts/common.sh@354 -- # echo 2 00:06:04.962 17:49:57 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:04.962 17:49:57 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:04.962 17:49:57 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:04.962 17:49:57 -- scripts/common.sh@367 -- # return 0 00:06:04.962 17:49:57 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:04.962 17:49:57 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:04.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.962 --rc genhtml_branch_coverage=1 00:06:04.962 --rc genhtml_function_coverage=1 00:06:04.962 --rc genhtml_legend=1 00:06:04.962 --rc geninfo_all_blocks=1 00:06:04.962 --rc geninfo_unexecuted_blocks=1 00:06:04.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.962 ' 00:06:04.962 17:49:57 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:04.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.962 --rc genhtml_branch_coverage=1 00:06:04.962 --rc genhtml_function_coverage=1 00:06:04.962 --rc genhtml_legend=1 00:06:04.962 --rc geninfo_all_blocks=1 00:06:04.962 --rc geninfo_unexecuted_blocks=1 00:06:04.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.962 ' 00:06:04.962 17:49:57 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:04.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.962 --rc 
genhtml_branch_coverage=1 00:06:04.962 --rc genhtml_function_coverage=1 00:06:04.962 --rc genhtml_legend=1 00:06:04.962 --rc geninfo_all_blocks=1 00:06:04.962 --rc geninfo_unexecuted_blocks=1 00:06:04.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.962 ' 00:06:04.962 17:49:57 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:04.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.962 --rc genhtml_branch_coverage=1 00:06:04.962 --rc genhtml_function_coverage=1 00:06:04.962 --rc genhtml_legend=1 00:06:04.962 --rc geninfo_all_blocks=1 00:06:04.962 --rc geninfo_unexecuted_blocks=1 00:06:04.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.962 ' 00:06:04.962 17:49:57 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:04.962 17:49:57 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=612648 00:06:04.962 17:49:57 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 612648 00:06:04.962 17:49:57 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:04.962 17:49:57 -- common/autotest_common.sh@829 -- # '[' -z 612648 ']' 00:06:04.962 17:49:57 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:04.962 17:49:57 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:04.962 17:49:57 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:04.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:04.962 17:49:57 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:04.962 17:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:04.962 [2024-11-19 17:49:57.792417] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:04.962 [2024-11-19 17:49:57.792519] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid612648 ] 00:06:04.962 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.222 [2024-11-19 17:49:57.873829] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.222 [2024-11-19 17:49:57.910949] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:05.222 [2024-11-19 17:49:57.911080] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.791 17:49:58 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:05.791 17:49:58 -- common/autotest_common.sh@862 -- # return 0 00:06:05.791 17:49:58 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:05.791 17:49:58 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:05.791 17:49:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:05.791 17:49:58 -- common/autotest_common.sh@10 -- # set +x 00:06:05.791 { 00:06:05.791 "filename": "/tmp/spdk_mem_dump.txt" 00:06:05.791 } 00:06:05.791 17:49:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:05.791 17:49:58 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:06.051 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:06.051 1 heaps totaling size 814.000000 MiB 00:06:06.051 size: 814.000000 MiB heap id: 0 00:06:06.051 end heaps---------- 00:06:06.051 8 mempools totaling size 598.116089 MiB 00:06:06.051 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:06.051 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:06.051 size: 84.521057 MiB name: bdev_io_612648 00:06:06.051 size: 51.011292 MiB name: evtpool_612648 00:06:06.051 size: 50.003479 MiB name: msgpool_612648 00:06:06.051 size: 21.763794 MiB name: PDU_Pool 00:06:06.051 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:06.051 size: 0.026123 MiB name: Session_Pool 00:06:06.051 end mempools------- 00:06:06.051 6 memzones totaling size 4.142822 MiB 00:06:06.051 size: 1.000366 MiB name: RG_ring_0_612648 00:06:06.051 size: 1.000366 MiB name: RG_ring_1_612648 00:06:06.051 size: 1.000366 MiB name: RG_ring_4_612648 00:06:06.051 size: 1.000366 MiB name: RG_ring_5_612648 00:06:06.051 size: 0.125366 MiB name: RG_ring_2_612648 00:06:06.051 size: 0.015991 MiB name: RG_ring_3_612648 00:06:06.051 end memzones------- 00:06:06.051 17:49:58 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:06.051 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:06:06.051 list of free elements. 
size: 12.519348 MiB 00:06:06.051 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:06.051 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:06.051 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:06.051 element at address: 0x200003e00000 with size: 0.996277 MiB 00:06:06.051 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:06.051 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:06.051 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:06.051 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:06.051 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:06.051 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:06:06.051 element at address: 0x20000b200000 with size: 0.490723 MiB 00:06:06.051 element at address: 0x200000800000 with size: 0.487793 MiB 00:06:06.051 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:06.051 element at address: 0x200027e00000 with size: 0.410034 MiB 00:06:06.051 element at address: 0x200003a00000 with size: 0.355530 MiB 00:06:06.051 list of standard malloc elements. size: 199.218079 MiB 00:06:06.051 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:06.051 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:06.051 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:06.052 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:06.052 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:06.052 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:06.052 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:06.052 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:06.052 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:06.052 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:06.052 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:06:06.052 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:06.052 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200027e69040 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:06.052 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:06.052 list of memzone associated elements. size: 602.262573 MiB 00:06:06.052 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:06.052 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:06.052 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:06.052 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:06.052 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:06.052 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_612648_0 00:06:06.052 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:06.052 associated memzone info: size: 48.002930 MiB name: MP_evtpool_612648_0 00:06:06.052 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:06.052 associated memzone info: size: 48.002930 MiB name: MP_msgpool_612648_0 00:06:06.052 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:06.052 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:06.052 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:06.052 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:06.052 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:06.052 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_612648 00:06:06.052 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:06.052 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_612648 00:06:06.052 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:06.052 associated memzone info: size: 1.007996 MiB name: MP_evtpool_612648 00:06:06.052 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:06.052 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:06.052 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:06.052 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:06.052 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:06.052 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:06.052 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:06.052 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:06.052 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:06.052 associated memzone info: size: 1.000366 MiB name: RG_ring_0_612648 00:06:06.052 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:06.052 associated memzone info: size: 1.000366 MiB name: RG_ring_1_612648 00:06:06.052 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:06.052 associated memzone info: size: 1.000366 MiB name: RG_ring_4_612648 00:06:06.052 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:06.052 associated memzone info: size: 1.000366 MiB name: RG_ring_5_612648 00:06:06.052 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:06.052 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_612648 00:06:06.052 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:06.052 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:06.052 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:06.052 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:06.052 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:06.052 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:06.052 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:06.052 associated memzone info: size: 0.125366 MiB name: RG_ring_2_612648 00:06:06.052 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:06.052 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:06.052 element at address: 0x200027e69100 with size: 0.023743 MiB 00:06:06.052 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:06.052 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:06.052 associated memzone info: size: 0.015991 MiB name: RG_ring_3_612648 00:06:06.052 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:06:06.052 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:06.052 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:06.052 associated memzone info: size: 0.000183 MiB name: MP_msgpool_612648 00:06:06.052 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:06.052 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_612648 00:06:06.052 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:06:06.052 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:06.052 17:49:58 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:06.052 17:49:58 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 612648 00:06:06.052 17:49:58 -- common/autotest_common.sh@936 -- # '[' -z 612648 ']' 00:06:06.052 17:49:58 -- common/autotest_common.sh@940 -- # kill -0 612648 00:06:06.052 17:49:58 -- common/autotest_common.sh@941 -- # uname 00:06:06.052 17:49:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:06.052 17:49:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 612648 00:06:06.052 17:49:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:06.052 17:49:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:06.052 17:49:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 612648' 00:06:06.052 killing process with pid 612648 00:06:06.052 17:49:58 -- common/autotest_common.sh@955 -- # kill 612648 00:06:06.052 17:49:58 -- common/autotest_common.sh@960 -- # wait 612648 00:06:06.312 00:06:06.312 real 0m1.501s 00:06:06.312 user 0m1.539s 00:06:06.312 sys 0m0.471s 00:06:06.312 17:49:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:06.312 17:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:06.312 ************************************ 00:06:06.312 END TEST dpdk_mem_utility 00:06:06.312 ************************************ 00:06:06.312 17:49:59 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:06.312 17:49:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:06.312 17:49:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.312 17:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:06.312 
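The memory report ending here was produced in two steps visible in the trace: the env_dpdk_get_mem_stats RPC has the target write a dump (the reply names /tmp/spdk_mem_dump.txt), and scripts/dpdk_mem_info.py renders the heap/mempool/memzone summary; -m 0 adds the element-by-element map of heap id 0. The equivalent manual invocation:

    # Step 1: ask the live target for a DPDK memory dump; the RPC replies
    # with {"filename": "/tmp/spdk_mem_dump.txt"} as shown above.
    ./scripts/rpc.py env_dpdk_get_mem_stats

    # Step 2: summarize the dump, then print the detailed map for heap 0.
    ./scripts/dpdk_mem_info.py
    ./scripts/dpdk_mem_info.py -m 0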
************************************ 00:06:06.312 START TEST event 00:06:06.312 ************************************ 00:06:06.312 17:49:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:06.571 * Looking for test storage... 00:06:06.571 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:06.571 17:49:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:06.571 17:49:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:06.571 17:49:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:06.571 17:49:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:06.571 17:49:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:06.571 17:49:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:06.571 17:49:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:06.571 17:49:59 -- scripts/common.sh@335 -- # IFS=.-: 00:06:06.571 17:49:59 -- scripts/common.sh@335 -- # read -ra ver1 00:06:06.571 17:49:59 -- scripts/common.sh@336 -- # IFS=.-: 00:06:06.571 17:49:59 -- scripts/common.sh@336 -- # read -ra ver2 00:06:06.571 17:49:59 -- scripts/common.sh@337 -- # local 'op=<' 00:06:06.571 17:49:59 -- scripts/common.sh@339 -- # ver1_l=2 00:06:06.571 17:49:59 -- scripts/common.sh@340 -- # ver2_l=1 00:06:06.571 17:49:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:06.571 17:49:59 -- scripts/common.sh@343 -- # case "$op" in 00:06:06.571 17:49:59 -- scripts/common.sh@344 -- # : 1 00:06:06.571 17:49:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:06.571 17:49:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:06.571 17:49:59 -- scripts/common.sh@364 -- # decimal 1 00:06:06.571 17:49:59 -- scripts/common.sh@352 -- # local d=1 00:06:06.571 17:49:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:06.571 17:49:59 -- scripts/common.sh@354 -- # echo 1 00:06:06.571 17:49:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:06.571 17:49:59 -- scripts/common.sh@365 -- # decimal 2 00:06:06.571 17:49:59 -- scripts/common.sh@352 -- # local d=2 00:06:06.571 17:49:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:06.571 17:49:59 -- scripts/common.sh@354 -- # echo 2 00:06:06.571 17:49:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:06.571 17:49:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:06.571 17:49:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:06.571 17:49:59 -- scripts/common.sh@367 -- # return 0 00:06:06.571 17:49:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:06.571 17:49:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:06.571 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.571 --rc genhtml_branch_coverage=1 00:06:06.571 --rc genhtml_function_coverage=1 00:06:06.571 --rc genhtml_legend=1 00:06:06.571 --rc geninfo_all_blocks=1 00:06:06.571 --rc geninfo_unexecuted_blocks=1 00:06:06.571 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.571 ' 00:06:06.571 17:49:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:06.571 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.571 --rc genhtml_branch_coverage=1 00:06:06.571 --rc genhtml_function_coverage=1 00:06:06.571 --rc genhtml_legend=1 00:06:06.571 --rc geninfo_all_blocks=1 00:06:06.571 --rc geninfo_unexecuted_blocks=1 00:06:06.571 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.571 ' 00:06:06.571 17:49:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:06.571 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.571 --rc genhtml_branch_coverage=1 00:06:06.571 --rc genhtml_function_coverage=1 00:06:06.571 --rc genhtml_legend=1 00:06:06.571 --rc geninfo_all_blocks=1 00:06:06.571 --rc geninfo_unexecuted_blocks=1 00:06:06.571 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.571 ' 00:06:06.571 17:49:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:06.571 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.571 --rc genhtml_branch_coverage=1 00:06:06.571 --rc genhtml_function_coverage=1 00:06:06.571 --rc genhtml_legend=1 00:06:06.571 --rc geninfo_all_blocks=1 00:06:06.571 --rc geninfo_unexecuted_blocks=1 00:06:06.571 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.571 ' 00:06:06.571 17:49:59 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:06.571 17:49:59 -- bdev/nbd_common.sh@6 -- # set -e 00:06:06.571 17:49:59 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:06.571 17:49:59 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:06.571 17:49:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.571 17:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:06.571 ************************************ 00:06:06.571 START TEST event_perf 00:06:06.571 ************************************ 00:06:06.571 17:49:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:06.571 Running I/O for 1 seconds...[2024-11-19 17:49:59.346633] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:06.571 [2024-11-19 17:49:59.346756] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid612990 ] 00:06:06.571 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.571 [2024-11-19 17:49:59.431733] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:06.830 [2024-11-19 17:49:59.470234] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:06.830 [2024-11-19 17:49:59.470346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:06.830 [2024-11-19 17:49:59.470452] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.830 [2024-11-19 17:49:59.470453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:07.766 Running I/O for 1 seconds... 00:06:07.766 lcore 0: 182695 00:06:07.766 lcore 1: 182695 00:06:07.766 lcore 2: 182696 00:06:07.766 lcore 3: 182695 00:06:07.766 done. 
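The four counters just printed are per-lcore event completions over the 1-second run, so the aggregate rate is their sum: roughly 730k events/sec across the 0xF mask (182695 x 3 + 182696 = 730781). A quick hypothetical one-liner for totaling them from a saved copy of the output, not part of the test itself:

  # Sum the trailing per-lcore counts; using $NF tolerates timestamp prefixes.
  awk '/lcore/ {sum += $NF} END {print sum " events/sec"}' event_perf.log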
00:06:07.766 00:06:07.766 real 0m1.201s 00:06:07.766 user 0m4.088s 00:06:07.766 sys 0m0.108s 00:06:07.766 17:50:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:07.766 17:50:00 -- common/autotest_common.sh@10 -- # set +x 00:06:07.766 ************************************ 00:06:07.766 END TEST event_perf 00:06:07.766 ************************************ 00:06:07.766 17:50:00 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:07.766 17:50:00 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:07.766 17:50:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.766 17:50:00 -- common/autotest_common.sh@10 -- # set +x 00:06:07.766 ************************************ 00:06:07.766 START TEST event_reactor 00:06:07.766 ************************************ 00:06:07.766 17:50:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:07.767 [2024-11-19 17:50:00.595836] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:07.767 [2024-11-19 17:50:00.595926] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid613275 ] 00:06:08.025 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.025 [2024-11-19 17:50:00.678112] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.025 [2024-11-19 17:50:00.712685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.963 test_start 00:06:08.963 oneshot 00:06:08.963 tick 100 00:06:08.963 tick 100 00:06:08.963 tick 250 00:06:08.963 tick 100 00:06:08.963 tick 100 00:06:08.963 tick 100 00:06:08.963 tick 250 00:06:08.963 tick 500 00:06:08.963 tick 100 00:06:08.963 tick 100 00:06:08.963 tick 250 00:06:08.963 tick 100 00:06:08.963 tick 100 00:06:08.963 test_end 00:06:08.963 00:06:08.963 real 0m1.188s 00:06:08.963 user 0m1.085s 00:06:08.963 sys 0m0.098s 00:06:08.963 17:50:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.963 17:50:01 -- common/autotest_common.sh@10 -- # set +x 00:06:08.963 ************************************ 00:06:08.963 END TEST event_reactor 00:06:08.963 ************************************ 00:06:08.963 17:50:01 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:08.963 17:50:01 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:08.963 17:50:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.963 17:50:01 -- common/autotest_common.sh@10 -- # set +x 00:06:08.963 ************************************ 00:06:08.963 START TEST event_reactor_perf 00:06:08.963 ************************************ 00:06:08.963 17:50:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:09.222 [2024-11-19 17:50:01.835509] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:09.223 [2024-11-19 17:50:01.835643] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid613558 ] 00:06:09.223 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.223 [2024-11-19 17:50:01.919868] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.223 [2024-11-19 17:50:01.954195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.161 test_start 00:06:10.161 test_end 00:06:10.161 Performance: 966497 events per second 00:06:10.161 00:06:10.161 real 0m1.192s 00:06:10.161 user 0m1.082s 00:06:10.161 sys 0m0.105s 00:06:10.161 17:50:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:10.161 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.161 ************************************ 00:06:10.161 END TEST event_reactor_perf 00:06:10.161 ************************************ 00:06:10.421 17:50:03 -- event/event.sh@49 -- # uname -s 00:06:10.421 17:50:03 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:10.421 17:50:03 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:10.421 17:50:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:10.421 17:50:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.421 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.421 ************************************ 00:06:10.421 START TEST event_scheduler 00:06:10.421 ************************************ 00:06:10.421 17:50:03 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:10.421 * Looking for test storage... 00:06:10.421 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:10.421 17:50:03 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:10.421 17:50:03 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:10.421 17:50:03 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:10.421 17:50:03 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:10.421 17:50:03 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:10.421 17:50:03 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:10.421 17:50:03 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:10.421 17:50:03 -- scripts/common.sh@335 -- # IFS=.-: 00:06:10.421 17:50:03 -- scripts/common.sh@335 -- # read -ra ver1 00:06:10.421 17:50:03 -- scripts/common.sh@336 -- # IFS=.-: 00:06:10.421 17:50:03 -- scripts/common.sh@336 -- # read -ra ver2 00:06:10.421 17:50:03 -- scripts/common.sh@337 -- # local 'op=<' 00:06:10.421 17:50:03 -- scripts/common.sh@339 -- # ver1_l=2 00:06:10.421 17:50:03 -- scripts/common.sh@340 -- # ver2_l=1 00:06:10.421 17:50:03 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:10.421 17:50:03 -- scripts/common.sh@343 -- # case "$op" in 00:06:10.421 17:50:03 -- scripts/common.sh@344 -- # : 1 00:06:10.421 17:50:03 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:10.421 17:50:03 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:10.421 17:50:03 -- scripts/common.sh@364 -- # decimal 1 00:06:10.421 17:50:03 -- scripts/common.sh@352 -- # local d=1 00:06:10.421 17:50:03 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:10.421 17:50:03 -- scripts/common.sh@354 -- # echo 1 00:06:10.421 17:50:03 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:10.421 17:50:03 -- scripts/common.sh@365 -- # decimal 2 00:06:10.421 17:50:03 -- scripts/common.sh@352 -- # local d=2 00:06:10.421 17:50:03 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:10.421 17:50:03 -- scripts/common.sh@354 -- # echo 2 00:06:10.421 17:50:03 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:10.421 17:50:03 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:10.421 17:50:03 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:10.421 17:50:03 -- scripts/common.sh@367 -- # return 0 00:06:10.421 17:50:03 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:10.421 17:50:03 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:10.421 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.421 --rc genhtml_branch_coverage=1 00:06:10.421 --rc genhtml_function_coverage=1 00:06:10.421 --rc genhtml_legend=1 00:06:10.421 --rc geninfo_all_blocks=1 00:06:10.421 --rc geninfo_unexecuted_blocks=1 00:06:10.421 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.421 ' 00:06:10.421 17:50:03 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:10.421 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.421 --rc genhtml_branch_coverage=1 00:06:10.421 --rc genhtml_function_coverage=1 00:06:10.421 --rc genhtml_legend=1 00:06:10.421 --rc geninfo_all_blocks=1 00:06:10.421 --rc geninfo_unexecuted_blocks=1 00:06:10.421 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.421 ' 00:06:10.421 17:50:03 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:10.421 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.421 --rc genhtml_branch_coverage=1 00:06:10.421 --rc genhtml_function_coverage=1 00:06:10.421 --rc genhtml_legend=1 00:06:10.421 --rc geninfo_all_blocks=1 00:06:10.421 --rc geninfo_unexecuted_blocks=1 00:06:10.421 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.421 ' 00:06:10.421 17:50:03 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:10.421 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.421 --rc genhtml_branch_coverage=1 00:06:10.421 --rc genhtml_function_coverage=1 00:06:10.421 --rc genhtml_legend=1 00:06:10.421 --rc geninfo_all_blocks=1 00:06:10.421 --rc geninfo_unexecuted_blocks=1 00:06:10.421 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.421 ' 00:06:10.422 17:50:03 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:10.422 17:50:03 -- scheduler/scheduler.sh@35 -- # scheduler_pid=613875 00:06:10.422 17:50:03 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:10.422 17:50:03 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:10.422 17:50:03 -- scheduler/scheduler.sh@37 -- # waitforlisten 613875 00:06:10.422 17:50:03 -- common/autotest_common.sh@829 -- # '[' -z 613875 ']' 00:06:10.422 17:50:03 -- common/autotest_common.sh@833 -- 
# local rpc_addr=/var/tmp/spdk.sock 00:06:10.422 17:50:03 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:10.422 17:50:03 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.422 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.422 17:50:03 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:10.422 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.422 [2024-11-19 17:50:03.276096] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:10.422 [2024-11-19 17:50:03.276165] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid613875 ] 00:06:10.681 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.681 [2024-11-19 17:50:03.356578] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:10.681 [2024-11-19 17:50:03.395034] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.681 [2024-11-19 17:50:03.395145] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:10.681 [2024-11-19 17:50:03.395253] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:10.681 [2024-11-19 17:50:03.395254] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:10.681 17:50:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:10.681 17:50:03 -- common/autotest_common.sh@862 -- # return 0 00:06:10.681 17:50:03 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:10.681 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.681 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.681 POWER: Env isn't set yet! 00:06:10.681 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:10.681 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:10.681 POWER: Cannot set governor of lcore 0 to userspace 00:06:10.681 POWER: Attempting to initialise PSTAT power management... 
00:06:10.681 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:10.681 POWER: Initialized successfully for lcore 0 power management 00:06:10.681 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:10.681 POWER: Initialized successfully for lcore 1 power management 00:06:10.681 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:10.681 POWER: Initialized successfully for lcore 2 power management 00:06:10.681 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:10.681 POWER: Initialized successfully for lcore 3 power management 00:06:10.681 [2024-11-19 17:50:03.495870] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:10.681 [2024-11-19 17:50:03.495886] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:10.681 [2024-11-19 17:50:03.495897] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:10.681 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.681 17:50:03 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:10.681 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.681 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 [2024-11-19 17:50:03.562284] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:10.941 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.941 17:50:03 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:10.941 17:50:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:10.941 17:50:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.941 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 ************************************ 00:06:10.941 START TEST scheduler_create_thread 00:06:10.941 ************************************ 00:06:10.941 17:50:03 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:06:10.941 17:50:03 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:10.941 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.941 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 2 00:06:10.941 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.941 17:50:03 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:10.941 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.941 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 3 00:06:10.941 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.941 17:50:03 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:10.941 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.941 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 4 00:06:10.941 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.941 17:50:03 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:10.941 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.941 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 5 00:06:10.941 
17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.941 17:50:03 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:10.941 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.941 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.941 6 00:06:10.941 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:10.942 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.942 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.942 7 00:06:10.942 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:10.942 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.942 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.942 8 00:06:10.942 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:10.942 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.942 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.942 9 00:06:10.942 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:10.942 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.942 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.942 10 00:06:10.942 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:10.942 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.942 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:10.942 17:50:03 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:10.942 17:50:03 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:10.942 17:50:03 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.942 17:50:03 -- common/autotest_common.sh@10 -- # set +x 00:06:11.880 17:50:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:11.880 17:50:04 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:11.880 17:50:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:11.880 17:50:04 -- common/autotest_common.sh@10 -- # set +x 00:06:13.261 17:50:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:13.261 17:50:05 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:13.261 17:50:05 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:13.261 17:50:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.261 17:50:05 -- common/autotest_common.sh@10 -- # set +x 00:06:14.199 17:50:06 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.199 00:06:14.199 real 0m3.382s 00:06:14.199 user 0m0.024s 00:06:14.199 sys 0m0.008s 00:06:14.199 17:50:06 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.199 17:50:06 -- common/autotest_common.sh@10 -- # set +x 00:06:14.199 ************************************ 00:06:14.199 END TEST scheduler_create_thread 00:06:14.199 ************************************ 00:06:14.199 17:50:06 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:14.199 17:50:06 -- scheduler/scheduler.sh@46 -- # killprocess 613875 00:06:14.199 17:50:06 -- common/autotest_common.sh@936 -- # '[' -z 613875 ']' 00:06:14.199 17:50:06 -- common/autotest_common.sh@940 -- # kill -0 613875 00:06:14.199 17:50:06 -- common/autotest_common.sh@941 -- # uname 00:06:14.199 17:50:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:14.199 17:50:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 613875 00:06:14.459 17:50:07 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:14.459 17:50:07 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:14.459 17:50:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 613875' 00:06:14.459 killing process with pid 613875 00:06:14.459 17:50:07 -- common/autotest_common.sh@955 -- # kill 613875 00:06:14.459 17:50:07 -- common/autotest_common.sh@960 -- # wait 613875 00:06:14.719 [2024-11-19 17:50:07.334064] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:14.719 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:14.719 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:14.719 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:14.719 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:14.719 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:14.719 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:14.719 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:14.719 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:14.719 00:06:14.719 real 0m4.482s 00:06:14.719 user 0m7.831s 00:06:14.719 sys 0m0.422s 00:06:14.719 17:50:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.719 17:50:07 -- common/autotest_common.sh@10 -- # set +x 00:06:14.719 ************************************ 00:06:14.719 END TEST event_scheduler 00:06:14.719 ************************************ 00:06:14.979 17:50:07 -- event/event.sh@51 -- # modprobe -n nbd 00:06:14.979 17:50:07 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:14.979 17:50:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.979 17:50:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.979 17:50:07 -- common/autotest_common.sh@10 -- # set +x 00:06:14.979 ************************************ 00:06:14.979 START TEST app_repeat 00:06:14.979 ************************************ 00:06:14.979 17:50:07 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:06:14.979 17:50:07 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.979 17:50:07 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.979 17:50:07 -- event/event.sh@13 -- # local nbd_list 00:06:14.979 17:50:07 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:14.979 17:50:07 -- 
event/event.sh@14 -- # local bdev_list 00:06:14.979 17:50:07 -- event/event.sh@15 -- # local repeat_times=4 00:06:14.979 17:50:07 -- event/event.sh@17 -- # modprobe nbd 00:06:14.979 17:50:07 -- event/event.sh@19 -- # repeat_pid=614652 00:06:14.979 17:50:07 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:14.979 17:50:07 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 614652' 00:06:14.979 Process app_repeat pid: 614652 00:06:14.979 17:50:07 -- event/event.sh@23 -- # for i in {0..2} 00:06:14.979 17:50:07 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:14.979 spdk_app_start Round 0 00:06:14.979 17:50:07 -- event/event.sh@25 -- # waitforlisten 614652 /var/tmp/spdk-nbd.sock 00:06:14.979 17:50:07 -- common/autotest_common.sh@829 -- # '[' -z 614652 ']' 00:06:14.979 17:50:07 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:14.979 17:50:07 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:14.979 17:50:07 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.979 17:50:07 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:14.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:14.979 17:50:07 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.979 17:50:07 -- common/autotest_common.sh@10 -- # set +x 00:06:14.979 [2024-11-19 17:50:07.621926] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:14.979 [2024-11-19 17:50:07.622010] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid614652 ] 00:06:14.979 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.979 [2024-11-19 17:50:07.691715] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:14.979 [2024-11-19 17:50:07.729527] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.979 [2024-11-19 17:50:07.729530] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.918 17:50:08 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.918 17:50:08 -- common/autotest_common.sh@862 -- # return 0 00:06:15.918 17:50:08 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:15.918 Malloc0 00:06:15.918 17:50:08 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:16.177 Malloc1 00:06:16.177 17:50:08 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:16.177 17:50:08 -- 
bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@12 -- # local i 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:16.177 17:50:08 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:16.177 /dev/nbd0 00:06:16.177 17:50:09 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:16.177 17:50:09 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:16.177 17:50:09 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:16.177 17:50:09 -- common/autotest_common.sh@867 -- # local i 00:06:16.177 17:50:09 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:16.177 17:50:09 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:16.177 17:50:09 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:16.177 17:50:09 -- common/autotest_common.sh@871 -- # break 00:06:16.177 17:50:09 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:16.177 17:50:09 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:16.177 17:50:09 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:16.177 1+0 records in 00:06:16.177 1+0 records out 00:06:16.177 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235841 s, 17.4 MB/s 00:06:16.177 17:50:09 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:16.177 17:50:09 -- common/autotest_common.sh@884 -- # size=4096 00:06:16.177 17:50:09 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:16.436 17:50:09 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:16.436 17:50:09 -- common/autotest_common.sh@887 -- # return 0 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:16.436 /dev/nbd1 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:16.436 17:50:09 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:16.436 17:50:09 -- common/autotest_common.sh@867 -- # local i 00:06:16.436 17:50:09 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:16.436 17:50:09 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:16.436 17:50:09 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:16.436 17:50:09 -- common/autotest_common.sh@871 -- # break 00:06:16.436 17:50:09 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:16.436 17:50:09 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:16.436 17:50:09 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 
00:06:16.436 1+0 records in 00:06:16.436 1+0 records out 00:06:16.436 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232545 s, 17.6 MB/s 00:06:16.436 17:50:09 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:16.436 17:50:09 -- common/autotest_common.sh@884 -- # size=4096 00:06:16.436 17:50:09 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:16.436 17:50:09 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:16.436 17:50:09 -- common/autotest_common.sh@887 -- # return 0 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.436 17:50:09 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:16.695 { 00:06:16.695 "nbd_device": "/dev/nbd0", 00:06:16.695 "bdev_name": "Malloc0" 00:06:16.695 }, 00:06:16.695 { 00:06:16.695 "nbd_device": "/dev/nbd1", 00:06:16.695 "bdev_name": "Malloc1" 00:06:16.695 } 00:06:16.695 ]' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:16.695 { 00:06:16.695 "nbd_device": "/dev/nbd0", 00:06:16.695 "bdev_name": "Malloc0" 00:06:16.695 }, 00:06:16.695 { 00:06:16.695 "nbd_device": "/dev/nbd1", 00:06:16.695 "bdev_name": "Malloc1" 00:06:16.695 } 00:06:16.695 ]' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:16.695 /dev/nbd1' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:16.695 /dev/nbd1' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@65 -- # count=2 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@95 -- # count=2 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:16.695 256+0 records in 00:06:16.695 256+0 records out 00:06:16.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107161 s, 97.9 MB/s 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:16.695 256+0 records in 00:06:16.695 256+0 records out 00:06:16.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195811 s, 53.6 MB/s 00:06:16.695 17:50:09 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:16.695 256+0 records in 00:06:16.695 256+0 records out 00:06:16.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210936 s, 49.7 MB/s 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:16.695 17:50:09 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@51 -- # local i 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@41 -- # break 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@45 -- # return 0 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:16.954 17:50:09 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@41 -- # break 00:06:17.213 17:50:09 -- 
bdev/nbd_common.sh@45 -- # return 0 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:17.213 17:50:09 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@65 -- # true 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@65 -- # count=0 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@104 -- # count=0 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:17.473 17:50:10 -- bdev/nbd_common.sh@109 -- # return 0 00:06:17.473 17:50:10 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:17.732 17:50:10 -- event/event.sh@35 -- # sleep 3 00:06:17.732 [2024-11-19 17:50:10.584191] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:17.991 [2024-11-19 17:50:10.617156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:17.991 [2024-11-19 17:50:10.617158] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.991 [2024-11-19 17:50:10.656914] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:17.992 [2024-11-19 17:50:10.656972] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:21.283 17:50:13 -- event/event.sh@23 -- # for i in {0..2} 00:06:21.283 17:50:13 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:21.283 spdk_app_start Round 1 00:06:21.283 17:50:13 -- event/event.sh@25 -- # waitforlisten 614652 /var/tmp/spdk-nbd.sock 00:06:21.283 17:50:13 -- common/autotest_common.sh@829 -- # '[' -z 614652 ']' 00:06:21.283 17:50:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:21.283 17:50:13 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:21.283 17:50:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:21.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:21.283 17:50:13 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:21.283 17:50:13 -- common/autotest_common.sh@10 -- # set +x 00:06:21.283 17:50:13 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:21.283 17:50:13 -- common/autotest_common.sh@862 -- # return 0 00:06:21.283 17:50:13 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:21.283 Malloc0 00:06:21.283 17:50:13 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:21.283 Malloc1 00:06:21.283 17:50:13 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@12 -- # local i 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.283 17:50:13 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:21.543 /dev/nbd0 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:21.543 17:50:14 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:21.543 17:50:14 -- common/autotest_common.sh@867 -- # local i 00:06:21.543 17:50:14 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:21.543 17:50:14 -- common/autotest_common.sh@871 -- # break 00:06:21.543 17:50:14 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:21.543 1+0 records in 00:06:21.543 1+0 records out 00:06:21.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000119497 s, 34.3 MB/s 00:06:21.543 17:50:14 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:21.543 17:50:14 -- common/autotest_common.sh@884 -- # size=4096 00:06:21.543 17:50:14 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:21.543 17:50:14 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:21.543 17:50:14 -- common/autotest_common.sh@887 -- # return 0 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:21.543 /dev/nbd1 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:21.543 17:50:14 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:21.543 17:50:14 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:21.543 17:50:14 -- common/autotest_common.sh@867 -- # local i 00:06:21.543 17:50:14 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:21.543 17:50:14 -- common/autotest_common.sh@871 -- # break 00:06:21.543 17:50:14 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:21.543 17:50:14 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:21.543 1+0 records in 00:06:21.543 1+0 records out 00:06:21.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233231 s, 17.6 MB/s 00:06:21.543 17:50:14 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:21.802 17:50:14 -- common/autotest_common.sh@884 -- # size=4096 00:06:21.802 17:50:14 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:21.802 17:50:14 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:21.802 17:50:14 -- common/autotest_common.sh@887 -- # return 0 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:21.802 { 00:06:21.802 "nbd_device": "/dev/nbd0", 00:06:21.802 "bdev_name": "Malloc0" 00:06:21.802 }, 00:06:21.802 { 00:06:21.802 "nbd_device": "/dev/nbd1", 00:06:21.802 "bdev_name": "Malloc1" 00:06:21.802 } 00:06:21.802 ]' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:21.802 { 00:06:21.802 "nbd_device": "/dev/nbd0", 00:06:21.802 "bdev_name": "Malloc0" 00:06:21.802 }, 00:06:21.802 { 00:06:21.802 "nbd_device": "/dev/nbd1", 00:06:21.802 "bdev_name": "Malloc1" 00:06:21.802 } 00:06:21.802 ]' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:21.802 /dev/nbd1' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:21.802 /dev/nbd1' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@65 -- # count=2 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:21.802 17:50:14 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:21.802 256+0 records in 00:06:21.802 256+0 records out 00:06:21.802 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109179 s, 96.0 MB/s 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:21.802 17:50:14 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:22.062 256+0 records in 00:06:22.062 256+0 records out 00:06:22.062 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198393 s, 52.9 MB/s 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:22.062 256+0 records in 00:06:22.062 256+0 records out 00:06:22.062 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211545 s, 49.6 MB/s 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@51 -- # local i 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@41 -- # break 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@45 -- # return 0 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.062 17:50:14 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@41 -- # break 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@45 -- # return 0 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.322 17:50:15 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@65 -- # true 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@65 -- # count=0 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@104 -- # count=0 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:22.581 17:50:15 -- bdev/nbd_common.sh@109 -- # return 0 00:06:22.581 17:50:15 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:22.840 17:50:15 -- event/event.sh@35 -- # sleep 3 00:06:23.099 [2024-11-19 17:50:15.717088] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:23.099 [2024-11-19 17:50:15.749625] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.099 [2024-11-19 17:50:15.749635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.099 [2024-11-19 17:50:15.789625] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:23.099 [2024-11-19 17:50:15.789666] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
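Each app_repeat round above ends with the same write/verify cycle from nbd_common.sh: fill a scratch file with random bytes, push it through the exported NBD device with O_DIRECT, then byte-compare the device against the file. A minimal sketch of that flow, reusing the block size, count, and cmp window shown in the dd/cmp traces (the scratch path is hypothetical, /dev/nbd0 assumes a device already exported as in the rounds above, and sudo for raw device access is an assumption):

  # Write 1 MiB of random data through the NBD device, then verify it.
  TMP=/tmp/nbdrandtest
  dd if=/dev/urandom of="$TMP" bs=4096 count=256                 # generate a 1 MiB test pattern
  sudo dd if="$TMP" of=/dev/nbd0 bs=4096 count=256 oflag=direct  # write it via NBD, bypassing the page cache
  sudo cmp -b -n 1M "$TMP" /dev/nbd0                             # compare device contents byte-for-byte
  rm "$TMP"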
00:06:26.392 17:50:18 -- event/event.sh@23 -- # for i in {0..2} 00:06:26.392 17:50:18 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:26.392 spdk_app_start Round 2 00:06:26.392 17:50:18 -- event/event.sh@25 -- # waitforlisten 614652 /var/tmp/spdk-nbd.sock 00:06:26.392 17:50:18 -- common/autotest_common.sh@829 -- # '[' -z 614652 ']' 00:06:26.392 17:50:18 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:26.392 17:50:18 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:26.392 17:50:18 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:26.392 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:26.392 17:50:18 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:26.392 17:50:18 -- common/autotest_common.sh@10 -- # set +x 00:06:26.392 17:50:18 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:26.392 17:50:18 -- common/autotest_common.sh@862 -- # return 0 00:06:26.392 17:50:18 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:26.392 Malloc0 00:06:26.392 17:50:18 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:26.392 Malloc1 00:06:26.392 17:50:19 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@12 -- # local i 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:26.392 17:50:19 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:26.651 /dev/nbd0 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:26.651 17:50:19 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:26.651 17:50:19 -- common/autotest_common.sh@867 -- # local i 00:06:26.651 17:50:19 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:26.651 17:50:19 -- common/autotest_common.sh@871 -- # break 00:06:26.651 17:50:19 -- common/autotest_common.sh@882 -- # (( i 
= 1 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:26.651 1+0 records in 00:06:26.651 1+0 records out 00:06:26.651 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025433 s, 16.1 MB/s 00:06:26.651 17:50:19 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:26.651 17:50:19 -- common/autotest_common.sh@884 -- # size=4096 00:06:26.651 17:50:19 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:26.651 17:50:19 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:26.651 17:50:19 -- common/autotest_common.sh@887 -- # return 0 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:26.651 /dev/nbd1 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:26.651 17:50:19 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:26.651 17:50:19 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:26.651 17:50:19 -- common/autotest_common.sh@867 -- # local i 00:06:26.651 17:50:19 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:26.651 17:50:19 -- common/autotest_common.sh@871 -- # break 00:06:26.651 17:50:19 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:26.651 17:50:19 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:26.910 1+0 records in 00:06:26.910 1+0 records out 00:06:26.910 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002587 s, 15.8 MB/s 00:06:26.910 17:50:19 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:26.910 17:50:19 -- common/autotest_common.sh@884 -- # size=4096 00:06:26.910 17:50:19 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:26.910 17:50:19 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:26.910 17:50:19 -- common/autotest_common.sh@887 -- # return 0 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:26.910 { 00:06:26.910 "nbd_device": "/dev/nbd0", 00:06:26.910 "bdev_name": "Malloc0" 00:06:26.910 }, 00:06:26.910 { 00:06:26.910 "nbd_device": "/dev/nbd1", 00:06:26.910 "bdev_name": "Malloc1" 00:06:26.910 } 00:06:26.910 ]' 00:06:26.910 17:50:19 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:06:26.910 { 00:06:26.910 "nbd_device": "/dev/nbd0", 00:06:26.910 "bdev_name": "Malloc0" 00:06:26.910 }, 00:06:26.910 { 00:06:26.910 "nbd_device": "/dev/nbd1", 00:06:26.910 "bdev_name": "Malloc1" 00:06:26.910 } 00:06:26.910 ]' 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:26.910 /dev/nbd1' 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:26.910 /dev/nbd1' 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@65 -- # count=2 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@95 -- # count=2 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:26.910 17:50:19 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:27.170 256+0 records in 00:06:27.170 256+0 records out 00:06:27.170 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115851 s, 90.5 MB/s 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:27.170 256+0 records in 00:06:27.170 256+0 records out 00:06:27.170 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199422 s, 52.6 MB/s 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:27.170 256+0 records in 00:06:27.170 256+0 records out 00:06:27.170 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212361 s, 49.4 MB/s 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@51 -- # local i 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:27.170 17:50:19 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@41 -- # break 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@45 -- # return 0 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@41 -- # break 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@45 -- # return 0 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.429 17:50:20 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@65 -- # true 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@65 -- # count=0 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@104 -- # count=0 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:27.689 17:50:20 -- bdev/nbd_common.sh@109 -- # return 0 00:06:27.689 17:50:20 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:27.949 17:50:20 -- event/event.sh@35 -- # sleep 3 00:06:28.209 [2024-11-19 17:50:20.841095] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:28.209 [2024-11-19 17:50:20.873771] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:28.209 [2024-11-19 17:50:20.873774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.209 [2024-11-19 17:50:20.913463] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:28.209 [2024-11-19 17:50:20.913506] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:31.503 17:50:23 -- event/event.sh@38 -- # waitforlisten 614652 /var/tmp/spdk-nbd.sock 00:06:31.503 17:50:23 -- common/autotest_common.sh@829 -- # '[' -z 614652 ']' 00:06:31.503 17:50:23 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:31.503 17:50:23 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.503 17:50:23 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:31.503 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:31.503 17:50:23 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.503 17:50:23 -- common/autotest_common.sh@10 -- # set +x 00:06:31.503 17:50:23 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.503 17:50:23 -- common/autotest_common.sh@862 -- # return 0 00:06:31.503 17:50:23 -- event/event.sh@39 -- # killprocess 614652 00:06:31.503 17:50:23 -- common/autotest_common.sh@936 -- # '[' -z 614652 ']' 00:06:31.503 17:50:23 -- common/autotest_common.sh@940 -- # kill -0 614652 00:06:31.503 17:50:23 -- common/autotest_common.sh@941 -- # uname 00:06:31.503 17:50:23 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:31.503 17:50:23 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 614652 00:06:31.503 17:50:23 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:31.503 17:50:23 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:31.503 17:50:23 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 614652' 00:06:31.503 killing process with pid 614652 00:06:31.503 17:50:23 -- common/autotest_common.sh@955 -- # kill 614652 00:06:31.503 17:50:23 -- common/autotest_common.sh@960 -- # wait 614652 00:06:31.503 spdk_app_start is called in Round 0. 00:06:31.503 Shutdown signal received, stop current app iteration 00:06:31.503 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:31.503 spdk_app_start is called in Round 1. 00:06:31.503 Shutdown signal received, stop current app iteration 00:06:31.503 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:31.503 spdk_app_start is called in Round 2. 00:06:31.503 Shutdown signal received, stop current app iteration 00:06:31.503 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:31.503 spdk_app_start is called in Round 3. 
00:06:31.503 Shutdown signal received, stop current app iteration 00:06:31.503 17:50:24 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:31.503 17:50:24 -- event/event.sh@42 -- # return 0 00:06:31.503 00:06:31.503 real 0m16.460s 00:06:31.503 user 0m35.336s 00:06:31.503 sys 0m2.990s 00:06:31.503 17:50:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.503 17:50:24 -- common/autotest_common.sh@10 -- # set +x 00:06:31.503 ************************************ 00:06:31.503 END TEST app_repeat 00:06:31.503 ************************************ 00:06:31.503 17:50:24 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:31.503 17:50:24 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:31.503 17:50:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.503 17:50:24 -- common/autotest_common.sh@10 -- # set +x 00:06:31.503 ************************************ 00:06:31.503 START TEST cpu_locks 00:06:31.503 ************************************ 00:06:31.503 17:50:24 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:31.503 * Looking for test storage... 00:06:31.503 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:31.503 17:50:24 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:31.503 17:50:24 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:31.503 17:50:24 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:31.503 17:50:24 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:31.503 17:50:24 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:31.503 17:50:24 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:31.503 17:50:24 -- scripts/common.sh@335 -- # IFS=.-: 00:06:31.503 17:50:24 -- scripts/common.sh@335 -- # read -ra ver1 00:06:31.503 17:50:24 -- scripts/common.sh@336 -- # IFS=.-: 00:06:31.503 17:50:24 -- scripts/common.sh@336 -- # read -ra ver2 00:06:31.503 17:50:24 -- scripts/common.sh@337 -- # local 'op=<' 00:06:31.503 17:50:24 -- scripts/common.sh@339 -- # ver1_l=2 00:06:31.503 17:50:24 -- scripts/common.sh@340 -- # ver2_l=1 00:06:31.503 17:50:24 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:31.503 17:50:24 -- scripts/common.sh@343 -- # case "$op" in 00:06:31.503 17:50:24 -- scripts/common.sh@344 -- # : 1 00:06:31.503 17:50:24 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:31.503 17:50:24 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:31.503 17:50:24 -- scripts/common.sh@364 -- # decimal 1 00:06:31.503 17:50:24 -- scripts/common.sh@352 -- # local d=1 00:06:31.503 17:50:24 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:31.503 17:50:24 -- scripts/common.sh@354 -- # echo 1 00:06:31.503 17:50:24 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:31.503 17:50:24 -- scripts/common.sh@365 -- # decimal 2 00:06:31.503 17:50:24 -- scripts/common.sh@352 -- # local d=2 00:06:31.503 17:50:24 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:31.503 17:50:24 -- scripts/common.sh@354 -- # echo 2 00:06:31.503 17:50:24 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:31.503 17:50:24 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:31.503 17:50:24 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:31.503 17:50:24 -- scripts/common.sh@367 -- # return 0 00:06:31.503 17:50:24 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:31.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.503 --rc genhtml_branch_coverage=1 00:06:31.503 --rc genhtml_function_coverage=1 00:06:31.503 --rc genhtml_legend=1 00:06:31.503 --rc geninfo_all_blocks=1 00:06:31.503 --rc geninfo_unexecuted_blocks=1 00:06:31.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.503 ' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:31.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.503 --rc genhtml_branch_coverage=1 00:06:31.503 --rc genhtml_function_coverage=1 00:06:31.503 --rc genhtml_legend=1 00:06:31.503 --rc geninfo_all_blocks=1 00:06:31.503 --rc geninfo_unexecuted_blocks=1 00:06:31.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.503 ' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:31.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.503 --rc genhtml_branch_coverage=1 00:06:31.503 --rc genhtml_function_coverage=1 00:06:31.503 --rc genhtml_legend=1 00:06:31.503 --rc geninfo_all_blocks=1 00:06:31.503 --rc geninfo_unexecuted_blocks=1 00:06:31.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.503 ' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:31.503 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.503 --rc genhtml_branch_coverage=1 00:06:31.503 --rc genhtml_function_coverage=1 00:06:31.503 --rc genhtml_legend=1 00:06:31.503 --rc geninfo_all_blocks=1 00:06:31.503 --rc geninfo_unexecuted_blocks=1 00:06:31.503 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.503 ' 00:06:31.503 17:50:24 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:31.503 17:50:24 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:31.503 17:50:24 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:31.503 17:50:24 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:31.503 17:50:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:31.503 17:50:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.503 17:50:24 -- common/autotest_common.sh@10 -- # set +x 00:06:31.503 ************************************ 00:06:31.503 START TEST default_locks 
00:06:31.503 ************************************ 00:06:31.504 17:50:24 -- common/autotest_common.sh@1114 -- # default_locks 00:06:31.504 17:50:24 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=617835 00:06:31.504 17:50:24 -- event/cpu_locks.sh@47 -- # waitforlisten 617835 00:06:31.504 17:50:24 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:31.504 17:50:24 -- common/autotest_common.sh@829 -- # '[' -z 617835 ']' 00:06:31.504 17:50:24 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.504 17:50:24 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.504 17:50:24 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:31.504 17:50:24 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.504 17:50:24 -- common/autotest_common.sh@10 -- # set +x 00:06:31.504 [2024-11-19 17:50:24.326384] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:31.504 [2024-11-19 17:50:24.326450] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid617835 ] 00:06:31.504 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.831 [2024-11-19 17:50:24.392021] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.831 [2024-11-19 17:50:24.429152] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:31.831 [2024-11-19 17:50:24.429266] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.466 17:50:25 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:32.466 17:50:25 -- common/autotest_common.sh@862 -- # return 0 00:06:32.466 17:50:25 -- event/cpu_locks.sh@49 -- # locks_exist 617835 00:06:32.466 17:50:25 -- event/cpu_locks.sh@22 -- # lslocks -p 617835 00:06:32.466 17:50:25 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:32.755 lslocks: write error 00:06:32.755 17:50:25 -- event/cpu_locks.sh@50 -- # killprocess 617835 00:06:32.755 17:50:25 -- common/autotest_common.sh@936 -- # '[' -z 617835 ']' 00:06:32.755 17:50:25 -- common/autotest_common.sh@940 -- # kill -0 617835 00:06:32.755 17:50:25 -- common/autotest_common.sh@941 -- # uname 00:06:32.755 17:50:25 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:32.755 17:50:25 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 617835 00:06:32.755 17:50:25 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:32.755 17:50:25 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:32.755 17:50:25 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 617835' 00:06:32.755 killing process with pid 617835 00:06:32.755 17:50:25 -- common/autotest_common.sh@955 -- # kill 617835 00:06:32.755 17:50:25 -- common/autotest_common.sh@960 -- # wait 617835 00:06:33.056 17:50:25 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 617835 00:06:33.056 17:50:25 -- common/autotest_common.sh@650 -- # local es=0 00:06:33.056 17:50:25 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 617835 00:06:33.056 17:50:25 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:33.056 17:50:25 -- common/autotest_common.sh@642 -- 
# case "$(type -t "$arg")" in 00:06:33.056 17:50:25 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:33.056 17:50:25 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:33.056 17:50:25 -- common/autotest_common.sh@653 -- # waitforlisten 617835 00:06:33.056 17:50:25 -- common/autotest_common.sh@829 -- # '[' -z 617835 ']' 00:06:33.056 17:50:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.056 17:50:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:33.056 17:50:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:33.056 17:50:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:33.056 17:50:25 -- common/autotest_common.sh@10 -- # set +x 00:06:33.056 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (617835) - No such process 00:06:33.056 ERROR: process (pid: 617835) is no longer running 00:06:33.056 17:50:25 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:33.056 17:50:25 -- common/autotest_common.sh@862 -- # return 1 00:06:33.056 17:50:25 -- common/autotest_common.sh@653 -- # es=1 00:06:33.056 17:50:25 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:33.056 17:50:25 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:33.056 17:50:25 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:33.056 17:50:25 -- event/cpu_locks.sh@54 -- # no_locks 00:06:33.056 17:50:25 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:33.056 17:50:25 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:33.056 17:50:25 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:33.056 00:06:33.056 real 0m1.591s 00:06:33.056 user 0m1.696s 00:06:33.056 sys 0m0.566s 00:06:33.056 17:50:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.056 17:50:25 -- common/autotest_common.sh@10 -- # set +x 00:06:33.056 ************************************ 00:06:33.056 END TEST default_locks 00:06:33.056 ************************************ 00:06:33.360 17:50:25 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:33.360 17:50:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.360 17:50:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.360 17:50:25 -- common/autotest_common.sh@10 -- # set +x 00:06:33.360 ************************************ 00:06:33.360 START TEST default_locks_via_rpc 00:06:33.360 ************************************ 00:06:33.360 17:50:25 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:33.360 17:50:25 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=618215 00:06:33.360 17:50:25 -- event/cpu_locks.sh@63 -- # waitforlisten 618215 00:06:33.360 17:50:25 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:33.360 17:50:25 -- common/autotest_common.sh@829 -- # '[' -z 618215 ']' 00:06:33.360 17:50:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.360 17:50:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:33.360 17:50:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:33.360 17:50:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:33.360 17:50:25 -- common/autotest_common.sh@10 -- # set +x 00:06:33.360 [2024-11-19 17:50:25.970292] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:33.360 [2024-11-19 17:50:25.970368] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid618215 ] 00:06:33.360 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.360 [2024-11-19 17:50:26.037732] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.360 [2024-11-19 17:50:26.072275] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:33.360 [2024-11-19 17:50:26.072392] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.011 17:50:26 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:34.011 17:50:26 -- common/autotest_common.sh@862 -- # return 0 00:06:34.011 17:50:26 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:34.011 17:50:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.011 17:50:26 -- common/autotest_common.sh@10 -- # set +x 00:06:34.011 17:50:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.011 17:50:26 -- event/cpu_locks.sh@67 -- # no_locks 00:06:34.011 17:50:26 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:34.011 17:50:26 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:34.011 17:50:26 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:34.011 17:50:26 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:34.011 17:50:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.011 17:50:26 -- common/autotest_common.sh@10 -- # set +x 00:06:34.011 17:50:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.011 17:50:26 -- event/cpu_locks.sh@71 -- # locks_exist 618215 00:06:34.011 17:50:26 -- event/cpu_locks.sh@22 -- # lslocks -p 618215 00:06:34.011 17:50:26 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:34.580 17:50:27 -- event/cpu_locks.sh@73 -- # killprocess 618215 00:06:34.580 17:50:27 -- common/autotest_common.sh@936 -- # '[' -z 618215 ']' 00:06:34.580 17:50:27 -- common/autotest_common.sh@940 -- # kill -0 618215 00:06:34.580 17:50:27 -- common/autotest_common.sh@941 -- # uname 00:06:34.580 17:50:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:34.580 17:50:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 618215 00:06:34.580 17:50:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:34.580 17:50:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:34.580 17:50:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 618215' 00:06:34.580 killing process with pid 618215 00:06:34.580 17:50:27 -- common/autotest_common.sh@955 -- # kill 618215 00:06:34.580 17:50:27 -- common/autotest_common.sh@960 -- # wait 618215 00:06:34.840 00:06:34.840 real 0m1.610s 00:06:34.840 user 0m1.680s 00:06:34.840 sys 0m0.582s 00:06:34.840 17:50:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.840 17:50:27 -- common/autotest_common.sh@10 -- # set +x 00:06:34.840 ************************************ 00:06:34.840 END TEST default_locks_via_rpc 00:06:34.840 ************************************ 00:06:34.840 17:50:27 -- event/cpu_locks.sh@168 -- # run_test 
non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:34.840 17:50:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.840 17:50:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.840 17:50:27 -- common/autotest_common.sh@10 -- # set +x 00:06:34.840 ************************************ 00:06:34.840 START TEST non_locking_app_on_locked_coremask 00:06:34.840 ************************************ 00:06:34.840 17:50:27 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:34.840 17:50:27 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=618549 00:06:34.840 17:50:27 -- event/cpu_locks.sh@81 -- # waitforlisten 618549 /var/tmp/spdk.sock 00:06:34.840 17:50:27 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:34.840 17:50:27 -- common/autotest_common.sh@829 -- # '[' -z 618549 ']' 00:06:34.840 17:50:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.840 17:50:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.840 17:50:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.840 17:50:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.840 17:50:27 -- common/autotest_common.sh@10 -- # set +x 00:06:34.840 [2024-11-19 17:50:27.632190] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:34.840 [2024-11-19 17:50:27.632278] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid618549 ] 00:06:34.840 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.840 [2024-11-19 17:50:27.700133] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.099 [2024-11-19 17:50:27.737126] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:35.099 [2024-11-19 17:50:27.737236] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.668 17:50:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.668 17:50:28 -- common/autotest_common.sh@862 -- # return 0 00:06:35.668 17:50:28 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:35.668 17:50:28 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=618566 00:06:35.668 17:50:28 -- event/cpu_locks.sh@85 -- # waitforlisten 618566 /var/tmp/spdk2.sock 00:06:35.668 17:50:28 -- common/autotest_common.sh@829 -- # '[' -z 618566 ']' 00:06:35.668 17:50:28 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:35.668 17:50:28 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:35.668 17:50:28 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:35.668 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:35.668 17:50:28 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:35.668 17:50:28 -- common/autotest_common.sh@10 -- # set +x 00:06:35.668 [2024-11-19 17:50:28.468390] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
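default_locks_via_rpc, which finished just above, exercises the same lock files but toggles them at runtime instead of at startup: rpc_cmd (a wrapper around scripts/rpc.py in these traces) issues framework_disable_cpumask_locks, the no_locks helper then finds no /var/tmp/spdk_cpu_lock_* files, and framework_enable_cpumask_locks re-acquires them. A sketch of the toggle against a target on the default socket, with a placeholder repo path:

    # Runtime lock toggle sketch (repo path is a placeholder).
    RPC=/path/to/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk.sock

    "$RPC" -s "$SOCK" framework_disable_cpumask_locks   # drop the per-core locks
    ls /var/tmp/spdk_cpu_lock_* 2>/dev/null             # expect no matches now
    "$RPC" -s "$SOCK" framework_enable_cpumask_locks    # claim them again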
00:06:35.668 [2024-11-19 17:50:28.468436] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid618566 ] 00:06:35.668 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.927 [2024-11-19 17:50:28.551414] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:35.927 [2024-11-19 17:50:28.551438] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.927 [2024-11-19 17:50:28.623310] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:35.927 [2024-11-19 17:50:28.623435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.504 17:50:29 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:36.504 17:50:29 -- common/autotest_common.sh@862 -- # return 0 00:06:36.505 17:50:29 -- event/cpu_locks.sh@87 -- # locks_exist 618549 00:06:36.505 17:50:29 -- event/cpu_locks.sh@22 -- # lslocks -p 618549 00:06:36.505 17:50:29 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:37.447 lslocks: write error 00:06:37.447 17:50:30 -- event/cpu_locks.sh@89 -- # killprocess 618549 00:06:37.447 17:50:30 -- common/autotest_common.sh@936 -- # '[' -z 618549 ']' 00:06:37.447 17:50:30 -- common/autotest_common.sh@940 -- # kill -0 618549 00:06:37.447 17:50:30 -- common/autotest_common.sh@941 -- # uname 00:06:37.447 17:50:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:37.447 17:50:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 618549 00:06:37.447 17:50:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:37.447 17:50:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:37.447 17:50:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 618549' 00:06:37.447 killing process with pid 618549 00:06:37.447 17:50:30 -- common/autotest_common.sh@955 -- # kill 618549 00:06:37.447 17:50:30 -- common/autotest_common.sh@960 -- # wait 618549 00:06:38.016 17:50:30 -- event/cpu_locks.sh@90 -- # killprocess 618566 00:06:38.016 17:50:30 -- common/autotest_common.sh@936 -- # '[' -z 618566 ']' 00:06:38.016 17:50:30 -- common/autotest_common.sh@940 -- # kill -0 618566 00:06:38.016 17:50:30 -- common/autotest_common.sh@941 -- # uname 00:06:38.016 17:50:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:38.016 17:50:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 618566 00:06:38.016 17:50:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:38.016 17:50:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:38.016 17:50:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 618566' 00:06:38.016 killing process with pid 618566 00:06:38.016 17:50:30 -- common/autotest_common.sh@955 -- # kill 618566 00:06:38.016 17:50:30 -- common/autotest_common.sh@960 -- # wait 618566 00:06:38.586 00:06:38.586 real 0m3.546s 00:06:38.586 user 0m3.799s 00:06:38.586 sys 0m1.205s 00:06:38.586 17:50:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.586 17:50:31 -- common/autotest_common.sh@10 -- # set +x 00:06:38.586 ************************************ 00:06:38.586 END TEST non_locking_app_on_locked_coremask 00:06:38.586 ************************************ 00:06:38.586 17:50:31 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 
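non_locking_app_on_locked_coremask, which just ended, shows why two targets can share core 0: the second instance is started with --disable-cpumask-locks, so it never tries to claim the /var/tmp/spdk_cpu_lock_000 file the first one holds (hence its "CPU core locks deactivated." startup notice), and -r gives it its own RPC socket so the two do not collide. A sketch of the pattern with a placeholder binary path:

    # Two targets on core 0 (binary path is a placeholder).
    BIN=/path/to/spdk/build/bin/spdk_tgt

    "$BIN" -m 0x1 &                                                 # claims the core-0 lock
    "$BIN" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &  # skips the claim
    # Without --disable-cpumask-locks the second launch aborts with:
    # "Cannot create lock on core 0, probably process <pid> has claimed it."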
00:06:38.586 17:50:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:38.586 17:50:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.586 17:50:31 -- common/autotest_common.sh@10 -- # set +x 00:06:38.586 ************************************ 00:06:38.586 START TEST locking_app_on_unlocked_coremask 00:06:38.586 ************************************ 00:06:38.586 17:50:31 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:38.586 17:50:31 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=619136 00:06:38.586 17:50:31 -- event/cpu_locks.sh@99 -- # waitforlisten 619136 /var/tmp/spdk.sock 00:06:38.586 17:50:31 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:38.586 17:50:31 -- common/autotest_common.sh@829 -- # '[' -z 619136 ']' 00:06:38.586 17:50:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:38.586 17:50:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:38.586 17:50:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:38.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:38.586 17:50:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:38.586 17:50:31 -- common/autotest_common.sh@10 -- # set +x 00:06:38.586 [2024-11-19 17:50:31.225585] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:38.586 [2024-11-19 17:50:31.225674] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid619136 ] 00:06:38.586 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.586 [2024-11-19 17:50:31.291042] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:38.586 [2024-11-19 17:50:31.291066] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.586 [2024-11-19 17:50:31.328201] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:38.586 [2024-11-19 17:50:31.328311] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.524 17:50:32 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:39.524 17:50:32 -- common/autotest_common.sh@862 -- # return 0 00:06:39.524 17:50:32 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=619383 00:06:39.524 17:50:32 -- event/cpu_locks.sh@103 -- # waitforlisten 619383 /var/tmp/spdk2.sock 00:06:39.524 17:50:32 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:39.524 17:50:32 -- common/autotest_common.sh@829 -- # '[' -z 619383 ']' 00:06:39.524 17:50:32 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:39.524 17:50:32 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:39.524 17:50:32 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:39.524 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
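Each "Waiting for process to start up and listen on UNIX domain socket ..." banner marks a waitforlisten call, which in these traces sets max_retries=100 and polls until the freshly forked target answers on its RPC socket. The trace never shows the helper's inner loop, so the sketch below is only an assumption about its shape, using rpc_get_methods as the liveness probe:

    # waitforlisten-style poll (inner loop and probe RPC are assumptions).
    waitforlisten() {
        local pid=$1 sock=${2:-/var/tmp/spdk.sock} i
        for ((i = 1; i <= 100; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1          # target died early
            /path/to/spdk/scripts/rpc.py -s "$sock" rpc_get_methods \
                >/dev/null 2>&1 && return 0                 # socket is answering
            sleep 0.1
        done
        return 1
    }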
00:06:39.524 17:50:32 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:39.524 17:50:32 -- common/autotest_common.sh@10 -- # set +x 00:06:39.524 [2024-11-19 17:50:32.084451] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:39.524 [2024-11-19 17:50:32.084540] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid619383 ] 00:06:39.524 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.524 [2024-11-19 17:50:32.174466] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.524 [2024-11-19 17:50:32.252107] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:39.524 [2024-11-19 17:50:32.252233] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.093 17:50:32 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:40.093 17:50:32 -- common/autotest_common.sh@862 -- # return 0 00:06:40.093 17:50:32 -- event/cpu_locks.sh@105 -- # locks_exist 619383 00:06:40.093 17:50:32 -- event/cpu_locks.sh@22 -- # lslocks -p 619383 00:06:40.093 17:50:32 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:41.029 lslocks: write error 00:06:41.029 17:50:33 -- event/cpu_locks.sh@107 -- # killprocess 619136 00:06:41.029 17:50:33 -- common/autotest_common.sh@936 -- # '[' -z 619136 ']' 00:06:41.029 17:50:33 -- common/autotest_common.sh@940 -- # kill -0 619136 00:06:41.029 17:50:33 -- common/autotest_common.sh@941 -- # uname 00:06:41.029 17:50:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:41.029 17:50:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 619136 00:06:41.029 17:50:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:41.029 17:50:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:41.029 17:50:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 619136' 00:06:41.029 killing process with pid 619136 00:06:41.029 17:50:33 -- common/autotest_common.sh@955 -- # kill 619136 00:06:41.029 17:50:33 -- common/autotest_common.sh@960 -- # wait 619136 00:06:41.598 17:50:34 -- event/cpu_locks.sh@108 -- # killprocess 619383 00:06:41.598 17:50:34 -- common/autotest_common.sh@936 -- # '[' -z 619383 ']' 00:06:41.598 17:50:34 -- common/autotest_common.sh@940 -- # kill -0 619383 00:06:41.598 17:50:34 -- common/autotest_common.sh@941 -- # uname 00:06:41.598 17:50:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:41.598 17:50:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 619383 00:06:41.598 17:50:34 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:41.598 17:50:34 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:41.598 17:50:34 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 619383' 00:06:41.598 killing process with pid 619383 00:06:41.598 17:50:34 -- common/autotest_common.sh@955 -- # kill 619383 00:06:41.598 17:50:34 -- common/autotest_common.sh@960 -- # wait 619383 00:06:41.857 00:06:41.857 real 0m3.433s 00:06:41.857 user 0m3.712s 00:06:41.857 sys 0m1.111s 00:06:41.857 17:50:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.857 17:50:34 -- common/autotest_common.sh@10 -- # set +x 00:06:41.857 ************************************ 00:06:41.857 END TEST locking_app_on_unlocked_coremask 00:06:41.857 
************************************ 00:06:41.857 17:50:34 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:41.857 17:50:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:41.857 17:50:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.857 17:50:34 -- common/autotest_common.sh@10 -- # set +x 00:06:41.857 ************************************ 00:06:41.857 START TEST locking_app_on_locked_coremask 00:06:41.857 ************************************ 00:06:41.857 17:50:34 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:41.857 17:50:34 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:41.857 17:50:34 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=619737 00:06:41.857 17:50:34 -- event/cpu_locks.sh@116 -- # waitforlisten 619737 /var/tmp/spdk.sock 00:06:41.857 17:50:34 -- common/autotest_common.sh@829 -- # '[' -z 619737 ']' 00:06:41.857 17:50:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:41.857 17:50:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:41.857 17:50:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:41.857 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:41.857 17:50:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:41.857 17:50:34 -- common/autotest_common.sh@10 -- # set +x 00:06:41.857 [2024-11-19 17:50:34.688609] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:41.857 [2024-11-19 17:50:34.688674] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid619737 ] 00:06:41.857 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.117 [2024-11-19 17:50:34.752153] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.117 [2024-11-19 17:50:34.789961] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:42.117 [2024-11-19 17:50:34.790067] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.685 17:50:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:42.685 17:50:35 -- common/autotest_common.sh@862 -- # return 0 00:06:42.685 17:50:35 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:42.685 17:50:35 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=619987 00:06:42.686 17:50:35 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 619987 /var/tmp/spdk2.sock 00:06:42.686 17:50:35 -- common/autotest_common.sh@650 -- # local es=0 00:06:42.686 17:50:35 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 619987 /var/tmp/spdk2.sock 00:06:42.686 17:50:35 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:42.686 17:50:35 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:42.686 17:50:35 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:42.686 17:50:35 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:42.686 17:50:35 -- common/autotest_common.sh@653 -- # waitforlisten 619987 /var/tmp/spdk2.sock 00:06:42.686 17:50:35 -- common/autotest_common.sh@829 -- # '[' -z 619987 
']' 00:06:42.686 17:50:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:42.686 17:50:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:42.686 17:50:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:42.686 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:42.686 17:50:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:42.686 17:50:35 -- common/autotest_common.sh@10 -- # set +x 00:06:42.945 [2024-11-19 17:50:35.552502] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:42.945 [2024-11-19 17:50:35.552554] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid619987 ] 00:06:42.945 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.945 [2024-11-19 17:50:35.638272] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 619737 has claimed it. 00:06:42.945 [2024-11-19 17:50:35.638301] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:43.513 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (619987) - No such process 00:06:43.513 ERROR: process (pid: 619987) is no longer running 00:06:43.513 17:50:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:43.513 17:50:36 -- common/autotest_common.sh@862 -- # return 1 00:06:43.513 17:50:36 -- common/autotest_common.sh@653 -- # es=1 00:06:43.513 17:50:36 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:43.513 17:50:36 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:43.513 17:50:36 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:43.513 17:50:36 -- event/cpu_locks.sh@122 -- # locks_exist 619737 00:06:43.513 17:50:36 -- event/cpu_locks.sh@22 -- # lslocks -p 619737 00:06:43.513 17:50:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:44.083 lslocks: write error 00:06:44.083 17:50:36 -- event/cpu_locks.sh@124 -- # killprocess 619737 00:06:44.083 17:50:36 -- common/autotest_common.sh@936 -- # '[' -z 619737 ']' 00:06:44.083 17:50:36 -- common/autotest_common.sh@940 -- # kill -0 619737 00:06:44.083 17:50:36 -- common/autotest_common.sh@941 -- # uname 00:06:44.083 17:50:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:44.083 17:50:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 619737 00:06:44.083 17:50:36 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:44.083 17:50:36 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:44.083 17:50:36 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 619737' 00:06:44.083 killing process with pid 619737 00:06:44.083 17:50:36 -- common/autotest_common.sh@955 -- # kill 619737 00:06:44.083 17:50:36 -- common/autotest_common.sh@960 -- # wait 619737 00:06:44.343 00:06:44.343 real 0m2.491s 00:06:44.343 user 0m2.767s 00:06:44.343 sys 0m0.710s 00:06:44.343 17:50:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:44.343 17:50:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.343 ************************************ 00:06:44.343 END TEST locking_app_on_locked_coremask 00:06:44.343 ************************************ 00:06:44.602 17:50:37 -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:44.602 17:50:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:44.602 17:50:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:44.602 17:50:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.602 ************************************ 00:06:44.602 START TEST locking_overlapped_coremask 00:06:44.602 ************************************ 00:06:44.602 17:50:37 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:44.602 17:50:37 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=620284 00:06:44.602 17:50:37 -- event/cpu_locks.sh@133 -- # waitforlisten 620284 /var/tmp/spdk.sock 00:06:44.602 17:50:37 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:44.602 17:50:37 -- common/autotest_common.sh@829 -- # '[' -z 620284 ']' 00:06:44.602 17:50:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:44.602 17:50:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:44.602 17:50:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:44.602 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:44.602 17:50:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:44.602 17:50:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.602 [2024-11-19 17:50:37.242664] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:44.602 [2024-11-19 17:50:37.242755] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid620284 ] 00:06:44.602 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.602 [2024-11-19 17:50:37.308956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:44.602 [2024-11-19 17:50:37.342521] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:44.602 [2024-11-19 17:50:37.342691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:44.602 [2024-11-19 17:50:37.342805] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:44.602 [2024-11-19 17:50:37.342808] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.538 17:50:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:45.538 17:50:38 -- common/autotest_common.sh@862 -- # return 0 00:06:45.538 17:50:38 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=620510 00:06:45.538 17:50:38 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 620510 /var/tmp/spdk2.sock 00:06:45.538 17:50:38 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:45.538 17:50:38 -- common/autotest_common.sh@650 -- # local es=0 00:06:45.538 17:50:38 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 620510 /var/tmp/spdk2.sock 00:06:45.538 17:50:38 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:45.538 17:50:38 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:45.538 17:50:38 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:45.538 17:50:38 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:45.538 17:50:38 -- 
00:06:45.538 17:50:38 -- common/autotest_common.sh@653 -- # waitforlisten 620510 /var/tmp/spdk2.sock
00:06:45.538 17:50:38 -- common/autotest_common.sh@829 -- # '[' -z 620510 ']'
00:06:45.538 17:50:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock
00:06:45.538 17:50:38 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:45.538 17:50:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
00:06:45.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:06:45.538 17:50:38 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:45.538 17:50:38 -- common/autotest_common.sh@10 -- # set +x
00:06:45.538 [2024-11-19 17:50:38.104448] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:45.538 [2024-11-19 17:50:38.104535] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid620510 ]
00:06:45.538 EAL: No free 2048 kB hugepages reported on node 1
00:06:45.538 [2024-11-19 17:50:38.198323] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 620284 has claimed it.
00:06:45.538 [2024-11-19 17:50:38.198363] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting.
00:06:46.106 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (620510) - No such process
00:06:46.106 ERROR: process (pid: 620510) is no longer running
00:06:46.106 17:50:38 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:46.106 17:50:38 -- common/autotest_common.sh@862 -- # return 1
00:06:46.106 17:50:38 -- common/autotest_common.sh@653 -- # es=1
00:06:46.106 17:50:38 -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:46.106 17:50:38 -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:06:46.106 17:50:38 -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:46.106 17:50:38 -- event/cpu_locks.sh@139 -- # check_remaining_locks
00:06:46.106 17:50:38 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*)
00:06:46.106 17:50:38 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
00:06:46.106 17:50:38 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]]
00:06:46.106 17:50:38 -- event/cpu_locks.sh@141 -- # killprocess 620284
00:06:46.106 17:50:38 -- common/autotest_common.sh@936 -- # '[' -z 620284 ']'
00:06:46.106 17:50:38 -- common/autotest_common.sh@940 -- # kill -0 620284
00:06:46.106 17:50:38 -- common/autotest_common.sh@941 -- # uname
00:06:46.106 17:50:38 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:06:46.106 17:50:38 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 620284
00:06:46.106 17:50:38 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:06:46.106 17:50:38 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:06:46.106 17:50:38 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 620284'
00:06:46.106 killing process with pid 620284
00:06:46.106 17:50:38 -- common/autotest_common.sh@955 -- # kill 620284
00:06:46.106 17:50:38 -- common/autotest_common.sh@960 -- # wait 620284
00:06:46.366
00:06:46.366 real 0m1.905s
00:06:46.366 user 0m5.508s
00:06:46.366 sys 0m0.456s
00:06:46.366 17:50:39 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:46.366 17:50:39 -- common/autotest_common.sh@10 -- # set +x
00:06:46.366 ************************************
00:06:46.366 END TEST locking_overlapped_coremask
00:06:46.366 ************************************
00:06:46.366 17:50:39 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc
00:06:46.366 17:50:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:46.366 17:50:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:46.366 17:50:39 -- common/autotest_common.sh@10 -- # set +x
00:06:46.366 ************************************
00:06:46.366 START TEST locking_overlapped_coremask_via_rpc
00:06:46.366 ************************************
00:06:46.366 17:50:39 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc
00:06:46.366 17:50:39 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=620602
00:06:46.366 17:50:39 -- event/cpu_locks.sh@149 -- # waitforlisten 620602 /var/tmp/spdk.sock
00:06:46.366 17:50:39 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks
00:06:46.366 17:50:39 -- common/autotest_common.sh@829 -- # '[' -z 620602 ']'
00:06:46.366 17:50:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:46.366 17:50:39 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:46.366 17:50:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:46.366 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:46.366 17:50:39 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:46.366 17:50:39 -- common/autotest_common.sh@10 -- # set +x
00:06:46.366 [2024-11-19 17:50:39.197664] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:46.366 [2024-11-19 17:50:39.197753] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid620602 ]
00:06:46.625 EAL: No free 2048 kB hugepages reported on node 1
00:06:46.625 [2024-11-19 17:50:39.265606] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated.
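Both targets in this test are launched with --disable-cpumask-locks, which is why the NOTICE above reports the core locks as deactivated instead of the startup failing on the overlapping mask. The two command lines, copied from the surrounding trace:

```bash
# Masks 0x7 (cores 0-2) and 0x1c (cores 2-4) overlap on core 2; with this
# flag neither process claims /var/tmp/spdk_cpu_lock_* files at boot, so
# both can start and the locks are claimed later via RPC instead.
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks
```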
00:06:46.625 [2024-11-19 17:50:39.265631] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3
00:06:46.625 [2024-11-19 17:50:39.304866] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:06:46.625 [2024-11-19 17:50:39.305006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:46.625 [2024-11-19 17:50:39.305103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:46.625 [2024-11-19 17:50:39.305105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:47.195 17:50:40 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:47.195 17:50:40 -- common/autotest_common.sh@862 -- # return 0
00:06:47.195 17:50:40 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=620865
00:06:47.195 17:50:40 -- event/cpu_locks.sh@153 -- # waitforlisten 620865 /var/tmp/spdk2.sock
00:06:47.195 17:50:40 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks
00:06:47.195 17:50:40 -- common/autotest_common.sh@829 -- # '[' -z 620865 ']'
00:06:47.195 17:50:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock
00:06:47.195 17:50:40 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:47.195 17:50:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
00:06:47.195 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:06:47.195 17:50:40 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:47.195 17:50:40 -- common/autotest_common.sh@10 -- # set +x
00:06:47.455 [2024-11-19 17:50:40.065370] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:47.455 [2024-11-19 17:50:40.065456] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid620865 ]
00:06:47.455 EAL: No free 2048 kB hugepages reported on node 1
00:06:47.455 [2024-11-19 17:50:40.160237] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated.
00:06:47.455 [2024-11-19 17:50:40.160265] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3
00:06:47.455 [2024-11-19 17:50:40.238668] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:06:47.455 [2024-11-19 17:50:40.238812] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:06:47.455 [2024-11-19 17:50:40.238945] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:47.455 [2024-11-19 17:50:40.238947] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4
00:06:48.394 17:50:40 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:48.394 17:50:40 -- common/autotest_common.sh@862 -- # return 0
00:06:48.394 17:50:40 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks
00:06:48.394 17:50:40 -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:48.394 17:50:40 -- common/autotest_common.sh@10 -- # set +x
00:06:48.394 17:50:40 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:48.394 17:50:40 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
00:06:48.394 17:50:40 -- common/autotest_common.sh@650 -- # local es=0
00:06:48.394 17:50:40 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
00:06:48.394 17:50:40 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd
00:06:48.394 17:50:40 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:48.394 17:50:40 -- common/autotest_common.sh@642 -- # type -t rpc_cmd
00:06:48.394 17:50:40 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:48.394 17:50:40 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
00:06:48.394 17:50:40 -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:48.394 17:50:40 -- common/autotest_common.sh@10 -- # set +x
00:06:48.394 [2024-11-19 17:50:40.923659] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 620602 has claimed it.
00:06:48.394 request:
00:06:48.394 {
00:06:48.394 "method": "framework_enable_cpumask_locks",
00:06:48.394 "req_id": 1
00:06:48.394 }
00:06:48.394 Got JSON-RPC error response
00:06:48.394 response:
00:06:48.394 {
00:06:48.394 "code": -32603,
00:06:48.394 "message": "Failed to claim CPU core: 2"
00:06:48.394 }
00:06:48.394 17:50:40 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]]
00:06:48.394 17:50:40 -- common/autotest_common.sh@653 -- # es=1
00:06:48.394 17:50:40 -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:48.394 17:50:40 -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:06:48.394 17:50:40 -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:48.394 17:50:40 -- event/cpu_locks.sh@158 -- # waitforlisten 620602 /var/tmp/spdk.sock
00:06:48.394 17:50:40 -- common/autotest_common.sh@829 -- # '[' -z 620602 ']'
00:06:48.394 17:50:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:48.394 17:50:40 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:48.394 17:50:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:48.394 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
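The JSON-RPC exchange above is the core of this test: with locks deactivated at boot, the first target claims its cores on demand via framework_enable_cpumask_locks, and the second target's attempt then fails on the shared core. Roughly equivalent direct invocations — rpc_cmd is a thin wrapper, and the scripts/rpc.py path here is an assumption about this tree's layout:

```bash
# First target (default /var/tmp/spdk.sock) claims cores 0-2 and succeeds.
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py framework_enable_cpumask_locks
# Second target then hits the overlap on core 2 and gets the -32603 error
# ("Failed to claim CPU core: 2") shown in the response above.
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
```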
00:06:48.394 17:50:40 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:48.394 17:50:40 -- common/autotest_common.sh@10 -- # set +x
00:06:48.654 17:50:41 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:48.654 17:50:41 -- common/autotest_common.sh@862 -- # return 0
00:06:48.654 17:50:41 -- event/cpu_locks.sh@159 -- # waitforlisten 620865 /var/tmp/spdk2.sock
00:06:48.654 17:50:41 -- common/autotest_common.sh@829 -- # '[' -z 620865 ']'
00:06:48.654 17:50:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock
00:06:48.654 17:50:41 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:48.654 17:50:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
00:06:48.654 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:06:48.654 17:50:41 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:48.654 17:50:41 -- common/autotest_common.sh@10 -- # set +x
00:06:48.654 17:50:41 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:48.654 17:50:41 -- common/autotest_common.sh@862 -- # return 0
00:06:48.654 17:50:41 -- event/cpu_locks.sh@161 -- # check_remaining_locks
00:06:48.654 17:50:41 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*)
00:06:48.654 17:50:41 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
00:06:48.654 17:50:41 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]]
00:06:48.654
00:06:48.654 real 0m2.149s
00:06:48.654 user 0m0.901s
00:06:48.654 sys 0m0.182s
00:06:48.654 17:50:41 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:48.654 17:50:41 -- common/autotest_common.sh@10 -- # set +x
00:06:48.654 ************************************
00:06:48.654 END TEST locking_overlapped_coremask_via_rpc
00:06:48.654 ************************************
00:06:48.654 17:50:41 -- event/cpu_locks.sh@174 -- # cleanup
00:06:48.654 17:50:41 -- event/cpu_locks.sh@15 -- # [[ -z 620602 ]]
00:06:48.654 17:50:41 -- event/cpu_locks.sh@15 -- # killprocess 620602
00:06:48.654 17:50:41 -- common/autotest_common.sh@936 -- # '[' -z 620602 ']'
00:06:48.654 17:50:41 -- common/autotest_common.sh@940 -- # kill -0 620602
00:06:48.654 17:50:41 -- common/autotest_common.sh@941 -- # uname
00:06:48.654 17:50:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:06:48.654 17:50:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 620602
00:06:48.654 17:50:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:06:48.654 17:50:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:06:48.654 17:50:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 620602'
00:06:48.654 killing process with pid 620602
00:06:48.654 17:50:41 -- common/autotest_common.sh@955 -- # kill 620602
00:06:48.654 17:50:41 -- common/autotest_common.sh@960 -- # wait 620602
00:06:48.914 17:50:41 -- event/cpu_locks.sh@16 -- # [[ -z 620865 ]]
00:06:48.914 17:50:41 -- event/cpu_locks.sh@16 -- # killprocess 620865
00:06:48.914 17:50:41 -- common/autotest_common.sh@936 -- # '[' -z 620865 ']'
00:06:48.914 17:50:41 -- common/autotest_common.sh@940 -- # kill -0 620865
00:06:48.914 17:50:41 -- common/autotest_common.sh@941 -- # uname
00:06:48.914 17:50:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:06:49.173 17:50:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 620865
00:06:49.173 17:50:41 -- common/autotest_common.sh@942 -- # process_name=reactor_2
00:06:49.173 17:50:41 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']'
00:06:49.173 17:50:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 620865'
00:06:49.173 killing process with pid 620865
00:06:49.173 17:50:41 -- common/autotest_common.sh@955 -- # kill 620865
00:06:49.173 17:50:41 -- common/autotest_common.sh@960 -- # wait 620865
00:06:49.433 17:50:42 -- event/cpu_locks.sh@18 -- # rm -f
00:06:49.433 17:50:42 -- event/cpu_locks.sh@1 -- # cleanup
00:06:49.433 17:50:42 -- event/cpu_locks.sh@15 -- # [[ -z 620602 ]]
00:06:49.433 17:50:42 -- event/cpu_locks.sh@15 -- # killprocess 620602
00:06:49.433 17:50:42 -- common/autotest_common.sh@936 -- # '[' -z 620602 ']'
00:06:49.433 17:50:42 -- common/autotest_common.sh@940 -- # kill -0 620602
00:06:49.433 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (620602) - No such process
00:06:49.433 17:50:42 -- common/autotest_common.sh@963 -- # echo 'Process with pid 620602 is not found'
00:06:49.433 Process with pid 620602 is not found
00:06:49.433 17:50:42 -- event/cpu_locks.sh@16 -- # [[ -z 620865 ]]
00:06:49.433 17:50:42 -- event/cpu_locks.sh@16 -- # killprocess 620865
00:06:49.433 17:50:42 -- common/autotest_common.sh@936 -- # '[' -z 620865 ']'
00:06:49.433 17:50:42 -- common/autotest_common.sh@940 -- # kill -0 620865
00:06:49.433 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (620865) - No such process
00:06:49.433 17:50:42 -- common/autotest_common.sh@963 -- # echo 'Process with pid 620865 is not found'
00:06:49.433 Process with pid 620865 is not found
00:06:49.433 17:50:42 -- event/cpu_locks.sh@18 -- # rm -f
00:06:49.433
00:06:49.433 real 0m17.995s
00:06:49.433 user 0m31.116s
00:06:49.433 sys 0m5.785s
00:06:49.433 17:50:42 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:49.433 17:50:42 -- common/autotest_common.sh@10 -- # set +x
00:06:49.433 ************************************
00:06:49.433 END TEST cpu_locks
00:06:49.433 ************************************
00:06:49.433
00:06:49.433 real 0m43.006s
00:06:49.433 user 1m20.740s
00:06:49.433 sys 0m9.865s
00:06:49.433 17:50:42 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:49.433 17:50:42 -- common/autotest_common.sh@10 -- # set +x
00:06:49.433 ************************************
00:06:49.433 END TEST event
00:06:49.433 ************************************
00:06:49.433 17:50:42 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh
00:06:49.433 17:50:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:49.433 17:50:42 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:49.433 17:50:42 -- common/autotest_common.sh@10 -- # set +x
00:06:49.433 ************************************
00:06:49.433 START TEST thread
00:06:49.433 ************************************
00:06:49.433 17:50:42 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh
00:06:49.433 * Looking for test storage...
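Every START TEST/END TEST banner pair in this log, including the ones just above, comes from the run_test helper in autotest_common.sh. A rough sketch of its visible behavior, reconstructed from the xtrace rather than copied from the script:

```bash
# Hedged approximation: the real helper also disables xtrace around the
# banners; the real/user/sys triplets in the log come from the time builtin.
run_test() {
    local test_name=$1; shift
    echo "************************************"
    echo "START TEST $test_name"
    echo "************************************"
    time "$@"
    local rc=$?
    echo "************************************"
    echo "END TEST $test_name"
    echo "************************************"
    return $rc
}
```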
00:06:49.433 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread
00:06:49.433 17:50:42 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:06:49.433 17:50:42 -- common/autotest_common.sh@1690 -- # lcov --version
00:06:49.433 17:50:42 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:06:49.693 17:50:42 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:06:49.693 17:50:42 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:06:49.693 17:50:42 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:06:49.693 17:50:42 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:06:49.693 17:50:42 -- scripts/common.sh@335 -- # IFS=.-:
00:06:49.693 17:50:42 -- scripts/common.sh@335 -- # read -ra ver1
00:06:49.693 17:50:42 -- scripts/common.sh@336 -- # IFS=.-:
00:06:49.693 17:50:42 -- scripts/common.sh@336 -- # read -ra ver2
00:06:49.693 17:50:42 -- scripts/common.sh@337 -- # local 'op=<'
00:06:49.693 17:50:42 -- scripts/common.sh@339 -- # ver1_l=2
00:06:49.693 17:50:42 -- scripts/common.sh@340 -- # ver2_l=1
00:06:49.693 17:50:42 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:06:49.693 17:50:42 -- scripts/common.sh@343 -- # case "$op" in
00:06:49.693 17:50:42 -- scripts/common.sh@344 -- # : 1
00:06:49.693 17:50:42 -- scripts/common.sh@363 -- # (( v = 0 ))
00:06:49.693 17:50:42 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:06:49.693 17:50:42 -- scripts/common.sh@364 -- # decimal 1
00:06:49.693 17:50:42 -- scripts/common.sh@352 -- # local d=1
00:06:49.693 17:50:42 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:06:49.693 17:50:42 -- scripts/common.sh@354 -- # echo 1
00:06:49.693 17:50:42 -- scripts/common.sh@364 -- # ver1[v]=1
00:06:49.693 17:50:42 -- scripts/common.sh@365 -- # decimal 2
00:06:49.693 17:50:42 -- scripts/common.sh@352 -- # local d=2
00:06:49.693 17:50:42 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:06:49.693 17:50:42 -- scripts/common.sh@354 -- # echo 2
00:06:49.693 17:50:42 -- scripts/common.sh@365 -- # ver2[v]=2
00:06:49.693 17:50:42 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:06:49.693 17:50:42 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:06:49.693 17:50:42 -- scripts/common.sh@367 -- # return 0
00:06:49.693 17:50:42 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:06:49.693 17:50:42 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:06:49.693 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:49.693 --rc genhtml_branch_coverage=1
00:06:49.693 --rc genhtml_function_coverage=1
00:06:49.693 --rc genhtml_legend=1
00:06:49.693 --rc geninfo_all_blocks=1
00:06:49.693 --rc geninfo_unexecuted_blocks=1
00:06:49.693 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:49.693 '
00:06:49.693 17:50:42 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:06:49.693 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:49.693 --rc genhtml_branch_coverage=1
00:06:49.693 --rc genhtml_function_coverage=1
00:06:49.693 --rc genhtml_legend=1
00:06:49.693 --rc geninfo_all_blocks=1
00:06:49.693 --rc geninfo_unexecuted_blocks=1
00:06:49.693 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:49.693 '
00:06:49.693 17:50:42 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:06:49.693 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:49.693 --rc genhtml_branch_coverage=1
00:06:49.693 --rc genhtml_function_coverage=1
00:06:49.693 --rc genhtml_legend=1
00:06:49.693 --rc geninfo_all_blocks=1
00:06:49.693 --rc geninfo_unexecuted_blocks=1
00:06:49.693 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:49.693 '
00:06:49.693 17:50:42 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:06:49.693 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:49.693 --rc genhtml_branch_coverage=1
00:06:49.693 --rc genhtml_function_coverage=1
00:06:49.693 --rc genhtml_legend=1
00:06:49.693 --rc geninfo_all_blocks=1
00:06:49.693 --rc geninfo_unexecuted_blocks=1
00:06:49.693 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:49.693 '
00:06:49.693 17:50:42 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:06:49.693 17:50:42 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']'
00:06:49.693 17:50:42 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:49.693 17:50:42 -- common/autotest_common.sh@10 -- # set +x
00:06:49.693 ************************************
00:06:49.693 START TEST thread_poller_perf
00:06:49.693 ************************************
00:06:49.693 17:50:42 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:06:49.693 [2024-11-19 17:50:42.376336] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:49.693 [2024-11-19 17:50:42.376426] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid621289 ]
00:06:49.693 EAL: No free 2048 kB hugepages reported on node 1
00:06:49.693 [2024-11-19 17:50:42.443898] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:49.693 [2024-11-19 17:50:42.479607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:49.693 Running 1000 pollers for 1 seconds with 1 microseconds period.
00:06:51.073 [2024-11-19T16:50:43.937Z] ======================================
00:06:51.073 [2024-11-19T16:50:43.937Z] busy:2506386662 (cyc)
00:06:51.073 [2024-11-19T16:50:43.937Z] total_run_count: 797000
00:06:51.073 [2024-11-19T16:50:43.937Z] tsc_hz: 2500000000 (cyc)
00:06:51.073 [2024-11-19T16:50:43.937Z] ======================================
00:06:51.073 [2024-11-19T16:50:43.937Z] poller_cost: 3144 (cyc), 1257 (nsec)
00:06:51.073
00:06:51.073 real 0m1.179s
00:06:51.073 user 0m1.089s
00:06:51.073 sys 0m0.085s
00:06:51.073 17:50:43 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:51.073 17:50:43 -- common/autotest_common.sh@10 -- # set +x
00:06:51.073 ************************************
00:06:51.073 END TEST thread_poller_perf
00:06:51.073 ************************************
00:06:51.074 17:50:43 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:06:51.074 17:50:43 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']'
00:06:51.074 17:50:43 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:51.074 17:50:43 -- common/autotest_common.sh@10 -- # set +x
00:06:51.074 ************************************
00:06:51.074 START TEST thread_poller_perf
00:06:51.074 ************************************
00:06:51.074 17:50:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:06:51.074 [2024-11-19 17:50:43.601004] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:51.074 [2024-11-19 17:50:43.601095] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid621532 ]
00:06:51.074 EAL: No free 2048 kB hugepages reported on node 1
00:06:51.074 [2024-11-19 17:50:43.668596] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:51.074 [2024-11-19 17:50:43.703113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:51.074 Running 1000 pollers for 1 seconds with 0 microseconds period.
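The poller_cost figures in these reports are simple derived values: busy cycles divided by total_run_count, converted to nanoseconds via tsc_hz. A quick check of the first run's numbers, copied from the output above:

```bash
# 2506386662 cyc / 797000 runs ≈ 3144 cyc per poller invocation; at a
# 2.5 GHz TSC that is ≈ 1257 nsec, matching the "poller_cost" line above.
awk 'BEGIN {
    busy = 2506386662; runs = 797000; tsc_hz = 2500000000
    cyc = busy / runs
    printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, cyc * 1e9 / tsc_hz
}'
```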
00:06:52.012 [2024-11-19T16:50:44.876Z] ======================================
00:06:52.012 [2024-11-19T16:50:44.876Z] busy:2502179232 (cyc)
00:06:52.012 [2024-11-19T16:50:44.876Z] total_run_count: 13128000
00:06:52.012 [2024-11-19T16:50:44.876Z] tsc_hz: 2500000000 (cyc)
00:06:52.012 [2024-11-19T16:50:44.876Z] ======================================
00:06:52.012 [2024-11-19T16:50:44.876Z] poller_cost: 190 (cyc), 76 (nsec)
00:06:52.012
00:06:52.012 real 0m1.174s
00:06:52.012 user 0m1.084s
00:06:52.012 sys 0m0.086s
00:06:52.012 17:50:44 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:52.012 17:50:44 -- common/autotest_common.sh@10 -- # set +x
00:06:52.012 ************************************
00:06:52.012 END TEST thread_poller_perf
00:06:52.012 ************************************
00:06:52.012 17:50:44 -- thread/thread.sh@17 -- # [[ n != \y ]]
00:06:52.012 17:50:44 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock
00:06:52.012 17:50:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:52.012 17:50:44 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:52.012 17:50:44 -- common/autotest_common.sh@10 -- # set +x
00:06:52.012 ************************************
00:06:52.012 START TEST thread_spdk_lock
00:06:52.012 ************************************
00:06:52.012 17:50:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock
00:06:52.012 [2024-11-19 17:50:44.825985] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:52.012 [2024-11-19 17:50:44.826107] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid621816 ]
00:06:52.012 EAL: No free 2048 kB hugepages reported on node 1
00:06:52.271 [2024-11-19 17:50:44.897520] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:06:52.271 [2024-11-19 17:50:44.932016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:52.271 [2024-11-19 17:50:44.932018] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:52.840 [2024-11-19 17:50:45.424653] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:52.840 [2024-11-19 17:50:45.424691] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread)
00:06:52.840 [2024-11-19 17:50:45.424702] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40
00:06:52.840 [2024-11-19 17:50:45.425523] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:52.840 [2024-11-19 17:50:45.425628] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:52.840 [2024-11-19 17:50:45.425647] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:52.840 Starting test contend
00:06:52.840 Worker Delay Wait us Hold us Total us
00:06:52.840 0 3 169446 185215 354662
00:06:52.840 1 5 89656 286675 376332
00:06:52.840 PASS test contend
00:06:52.840 Starting test hold_by_poller
00:06:52.840 PASS test hold_by_poller
00:06:52.840 Starting test hold_by_message
00:06:52.840 PASS test hold_by_message
00:06:52.840 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary:
00:06:52.840 100014 assertions passed
00:06:52.840 0 assertions failed
00:06:52.840
00:06:52.840 real 0m0.668s
00:06:52.840 user 0m1.066s
00:06:52.840 sys 0m0.091s
00:06:52.840 17:50:45 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:52.840 17:50:45 -- common/autotest_common.sh@10 -- # set +x
00:06:52.840 ************************************
00:06:52.840 END TEST thread_spdk_lock
00:06:52.840 ************************************
00:06:52.840
00:06:52.840 real 0m3.334s
00:06:52.840 user 0m3.386s
00:06:52.840 sys 0m0.472s
00:06:52.840 17:50:45 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:52.840 17:50:45 -- common/autotest_common.sh@10 -- # set +x
00:06:52.840 ************************************
00:06:52.840 END TEST thread
00:06:52.840 ************************************
00:06:52.840 17:50:45 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh
00:06:52.840 17:50:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:52.840 17:50:45 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:52.840 17:50:45 -- common/autotest_common.sh@10 -- # set +x
00:06:52.840 ************************************
00:06:52.840 START TEST accel
00:06:52.840 ************************************
00:06:52.840 17:50:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh
00:06:52.840 * Looking for test storage...
00:06:52.840 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel
00:06:52.840 17:50:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:06:52.840 17:50:45 -- common/autotest_common.sh@1690 -- # lcov --version
00:06:52.840 17:50:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:06:53.100 17:50:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:06:53.100 17:50:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:06:53.100 17:50:45 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:06:53.100 17:50:45 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:06:53.100 17:50:45 -- scripts/common.sh@335 -- # IFS=.-:
00:06:53.100 17:50:45 -- scripts/common.sh@335 -- # read -ra ver1
00:06:53.100 17:50:45 -- scripts/common.sh@336 -- # IFS=.-:
00:06:53.100 17:50:45 -- scripts/common.sh@336 -- # read -ra ver2
00:06:53.100 17:50:45 -- scripts/common.sh@337 -- # local 'op=<'
00:06:53.100 17:50:45 -- scripts/common.sh@339 -- # ver1_l=2
00:06:53.100 17:50:45 -- scripts/common.sh@340 -- # ver2_l=1
00:06:53.100 17:50:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:06:53.100 17:50:45 -- scripts/common.sh@343 -- # case "$op" in
00:06:53.100 17:50:45 -- scripts/common.sh@344 -- # : 1
00:06:53.100 17:50:45 -- scripts/common.sh@363 -- # (( v = 0 ))
00:06:53.100 17:50:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:06:53.100 17:50:45 -- scripts/common.sh@364 -- # decimal 1
00:06:53.100 17:50:45 -- scripts/common.sh@352 -- # local d=1
00:06:53.100 17:50:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:06:53.100 17:50:45 -- scripts/common.sh@354 -- # echo 1
00:06:53.100 17:50:45 -- scripts/common.sh@364 -- # ver1[v]=1
00:06:53.100 17:50:45 -- scripts/common.sh@365 -- # decimal 2
00:06:53.100 17:50:45 -- scripts/common.sh@352 -- # local d=2
00:06:53.100 17:50:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:06:53.100 17:50:45 -- scripts/common.sh@354 -- # echo 2
00:06:53.100 17:50:45 -- scripts/common.sh@365 -- # ver2[v]=2
00:06:53.100 17:50:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:06:53.100 17:50:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:06:53.100 17:50:45 -- scripts/common.sh@367 -- # return 0
00:06:53.100 17:50:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:06:53.100 17:50:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:06:53.100 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:53.100 --rc genhtml_branch_coverage=1
00:06:53.100 --rc genhtml_function_coverage=1
00:06:53.100 --rc genhtml_legend=1
00:06:53.101 --rc geninfo_all_blocks=1
00:06:53.101 --rc geninfo_unexecuted_blocks=1
00:06:53.101 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:53.101 '
00:06:53.101 17:50:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:06:53.101 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:53.101 --rc genhtml_branch_coverage=1
00:06:53.101 --rc genhtml_function_coverage=1
00:06:53.101 --rc genhtml_legend=1
00:06:53.101 --rc geninfo_all_blocks=1
00:06:53.101 --rc geninfo_unexecuted_blocks=1
00:06:53.101 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:53.101 '
00:06:53.101 17:50:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:06:53.101 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:53.101 --rc genhtml_branch_coverage=1
00:06:53.101 --rc genhtml_function_coverage=1
00:06:53.101 --rc genhtml_legend=1
00:06:53.101 --rc geninfo_all_blocks=1
00:06:53.101 --rc geninfo_unexecuted_blocks=1
00:06:53.101 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:53.101 '
00:06:53.101 17:50:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:06:53.101 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:53.101 --rc genhtml_branch_coverage=1
00:06:53.101 --rc genhtml_function_coverage=1
00:06:53.101 --rc genhtml_legend=1
00:06:53.101 --rc geninfo_all_blocks=1
00:06:53.101 --rc geninfo_unexecuted_blocks=1
00:06:53.101 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:53.101 '
00:06:53.101 17:50:45 -- accel/accel.sh@73 -- # declare -A expected_opcs
00:06:53.101 17:50:45 -- accel/accel.sh@74 -- # get_expected_opcs
00:06:53.101 17:50:45 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:06:53.101 17:50:45 -- accel/accel.sh@59 -- # spdk_tgt_pid=622139
00:06:53.101 17:50:45 -- accel/accel.sh@60 -- # waitforlisten 622139
00:06:53.101 17:50:45 -- common/autotest_common.sh@829 -- # '[' -z 622139 ']'
00:06:53.101 17:50:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:53.101 17:50:45 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63
00:06:53.101 17:50:45 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:53.101 17:50:45 -- accel/accel.sh@58 -- # build_accel_config
00:06:53.101 17:50:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:53.101 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:53.101 17:50:45 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:53.101 17:50:45 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:53.101 17:50:45 -- common/autotest_common.sh@10 -- # set +x
00:06:53.101 17:50:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:53.101 17:50:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:53.101 17:50:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:53.101 17:50:45 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:53.101 17:50:45 -- accel/accel.sh@41 -- # local IFS=,
00:06:53.101 17:50:45 -- accel/accel.sh@42 -- # jq -r .
00:06:53.101 [2024-11-19 17:50:45.783769] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:53.101 [2024-11-19 17:50:45.783846] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622139 ]
00:06:53.101 EAL: No free 2048 kB hugepages reported on node 1
00:06:53.101 [2024-11-19 17:50:45.850316] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:53.101 [2024-11-19 17:50:45.885818] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:06:53.101 [2024-11-19 17:50:45.885935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:54.040 17:50:46 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:54.040 17:50:46 -- common/autotest_common.sh@862 -- # return 0
00:06:54.040 17:50:46 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]"))
00:06:54.040 17:50:46 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments
00:06:54.040 17:50:46 -- accel/accel.sh@62 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
00:06:54.040 17:50:46 -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:54.040 17:50:46 -- common/autotest_common.sh@10 -- # set +x
00:06:54.040 17:50:46 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:54.040 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.040 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.040 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.040 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.040 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.040 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.040 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.040 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.040 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # IFS==
00:06:54.041 17:50:46 -- accel/accel.sh@64 -- # read -r opc module
00:06:54.041 17:50:46 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:54.041 17:50:46 -- accel/accel.sh@67 -- # killprocess 622139
00:06:54.041 17:50:46 -- common/autotest_common.sh@936 -- # '[' -z 622139 ']'
00:06:54.041 17:50:46 -- common/autotest_common.sh@940 -- # kill -0 622139
00:06:54.041 17:50:46 -- common/autotest_common.sh@941 -- # uname
00:06:54.041 17:50:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:06:54.041 17:50:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 622139
00:06:54.041 17:50:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:06:54.041 17:50:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:06:54.041 17:50:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 622139'
00:06:54.041 killing process with pid 622139
00:06:54.041 17:50:46 -- common/autotest_common.sh@955 -- # kill 622139
00:06:54.041 17:50:46 -- common/autotest_common.sh@960 -- # wait 622139
00:06:54.300 17:50:47 -- accel/accel.sh@68 -- # trap - ERR
00:06:54.300 17:50:47 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h
00:06:54.300 17:50:47 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']'
00:06:54.300 17:50:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:54.300 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:54.300 17:50:47 -- common/autotest_common.sh@1114 -- # accel_perf -h
00:06:54.300 17:50:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h
00:06:54.300 17:50:47 -- accel/accel.sh@12 -- # build_accel_config
00:06:54.300 17:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:54.300 17:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:54.300 17:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:54.300 17:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:54.300 17:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:54.300 17:50:47 -- accel/accel.sh@41 -- # local IFS=,
00:06:54.300 17:50:47 -- accel/accel.sh@42 -- # jq -r .
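The read loop traced above splits the output of the jq transform at accel.sh@62 on IFS== into opcode/module pairs. A standalone illustration of that same filter — the sample input object here is hypothetical, not taken from this run:

```bash
# jq flattens the accel_get_opc_assignments JSON map into key=value lines,
# one per opcode, which the "read -r opc module" loop above then consumes.
echo '{ "copy": "software", "fill": "software", "crc32c": "software" }' \
  | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
# => copy=software
#    fill=software
#    crc32c=software
```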
00:06:54.300 17:50:47 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:54.300 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:54.300 17:50:47 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress
00:06:54.300 17:50:47 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']'
00:06:54.300 17:50:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:54.300 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:54.300 ************************************
00:06:54.300 START TEST accel_missing_filename
00:06:54.300 ************************************
00:06:54.301 17:50:47 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress
00:06:54.301 17:50:47 -- common/autotest_common.sh@650 -- # local es=0
00:06:54.301 17:50:47 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress
00:06:54.301 17:50:47 -- common/autotest_common.sh@638 -- # local arg=accel_perf
00:06:54.301 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:54.301 17:50:47 -- common/autotest_common.sh@642 -- # type -t accel_perf
00:06:54.301 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:54.301 17:50:47 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress
00:06:54.301 17:50:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress
00:06:54.301 17:50:47 -- accel/accel.sh@12 -- # build_accel_config
00:06:54.301 17:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:54.301 17:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:54.301 17:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:54.301 17:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:54.301 17:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:54.301 17:50:47 -- accel/accel.sh@41 -- # local IFS=,
00:06:54.301 17:50:47 -- accel/accel.sh@42 -- # jq -r .
00:06:54.301 [2024-11-19 17:50:47.112648] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:54.301 [2024-11-19 17:50:47.112754] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622349 ]
00:06:54.301 EAL: No free 2048 kB hugepages reported on node 1
00:06:54.560 [2024-11-19 17:50:47.183745] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:54.560 [2024-11-19 17:50:47.219052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:54.560 [2024-11-19 17:50:47.258312] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:54.560 [2024-11-19 17:50:47.318556] accel_perf.c:1385:main: *ERROR*: ERROR starting application
00:06:54.560 A filename is required.
00:06:54.560 17:50:47 -- common/autotest_common.sh@653 -- # es=234
00:06:54.560 17:50:47 -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:54.560 17:50:47 -- common/autotest_common.sh@662 -- # es=106
00:06:54.560 17:50:47 -- common/autotest_common.sh@663 -- # case "$es" in
00:06:54.560 17:50:47 -- common/autotest_common.sh@670 -- # es=1
00:06:54.560 17:50:47 -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:54.560
00:06:54.560 real 0m0.288s
00:06:54.560 user 0m0.184s
00:06:54.560 sys 0m0.140s
00:06:54.560 17:50:47 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:54.560 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:54.560 ************************************
00:06:54.560 END TEST accel_missing_filename
00:06:54.560 ************************************
00:06:54.560 17:50:47 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:54.560 17:50:47 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']'
00:06:54.560 17:50:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:54.560 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:54.560 ************************************
00:06:54.560 START TEST accel_compress_verify
00:06:54.560 ************************************
00:06:54.820 17:50:47 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:54.820 17:50:47 -- common/autotest_common.sh@650 -- # local es=0
00:06:54.820 17:50:47 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:54.820 17:50:47 -- common/autotest_common.sh@638 -- # local arg=accel_perf
00:06:54.820 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:54.820 17:50:47 -- common/autotest_common.sh@642 -- # type -t accel_perf
00:06:54.820 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:54.820 17:50:47 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:54.820 17:50:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:54.820 17:50:47 -- accel/accel.sh@12 -- # build_accel_config
00:06:54.820 17:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:54.820 17:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:54.820 17:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:54.820 17:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:54.820 17:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:54.820 17:50:47 -- accel/accel.sh@41 -- # local IFS=,
00:06:54.820 17:50:47 -- accel/accel.sh@42 -- # jq -r .
00:06:54.820 [2024-11-19 17:50:47.448547] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:54.820 [2024-11-19 17:50:47.448737] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622469 ]
00:06:54.820 EAL: No free 2048 kB hugepages reported on node 1
00:06:54.820 [2024-11-19 17:50:47.519110] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:54.820 [2024-11-19 17:50:47.554380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:54.820 [2024-11-19 17:50:47.594029] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:54.820 [2024-11-19 17:50:47.653008] accel_perf.c:1385:main: *ERROR*: ERROR starting application
00:06:55.080
00:06:55.080 Compression does not support the verify option, aborting.
00:06:55.080 17:50:47 -- common/autotest_common.sh@653 -- # es=161
00:06:55.080 17:50:47 -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:55.080 17:50:47 -- common/autotest_common.sh@662 -- # es=33
00:06:55.080 17:50:47 -- common/autotest_common.sh@663 -- # case "$es" in
00:06:55.080 17:50:47 -- common/autotest_common.sh@670 -- # es=1
00:06:55.080 17:50:47 -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:55.080
00:06:55.080 real 0m0.286s
00:06:55.080 user 0m0.187s
00:06:55.080 sys 0m0.136s
00:06:55.080 17:50:47 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:55.080 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:55.080 ************************************
00:06:55.080 END TEST accel_compress_verify
00:06:55.080 ************************************
00:06:55.080 17:50:47 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar
00:06:55.080 17:50:47 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']'
00:06:55.080 17:50:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:55.080 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:55.080 ************************************
00:06:55.080 START TEST accel_wrong_workload
00:06:55.080 ************************************
00:06:55.080 17:50:47 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar
00:06:55.080 17:50:47 -- common/autotest_common.sh@650 -- # local es=0
00:06:55.080 17:50:47 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar
00:06:55.080 17:50:47 -- common/autotest_common.sh@638 -- # local arg=accel_perf
00:06:55.080 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:55.080 17:50:47 -- common/autotest_common.sh@642 -- # type -t accel_perf
00:06:55.080 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:55.080 17:50:47 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar
00:06:55.080 17:50:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar
00:06:55.080 17:50:47 -- accel/accel.sh@12 -- # build_accel_config
00:06:55.080 17:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:55.080 17:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:55.080 17:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:55.080 17:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:55.080 17:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:55.080 17:50:47 -- accel/accel.sh@41 -- # local IFS=,
00:06:55.080 17:50:47 -- accel/accel.sh@42 -- # jq -r .
00:06:55.080 Unsupported workload type: foobar
00:06:55.080 [2024-11-19 17:50:47.773632] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:06:55.080 accel_perf options:
00:06:55.080 [-h help message]
00:06:55.080 [-q queue depth per core]
00:06:55.080 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:06:55.080 [-T number of threads per core
00:06:55.080 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:55.080 [-t time in seconds]
00:06:55.080 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:06:55.080 [ dif_verify, , dif_generate, dif_generate_copy
00:06:55.080 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:06:55.080 [-l for compress/decompress workloads, name of uncompressed input file
00:06:55.080 [-S for crc32c workload, use this seed value (default 0)
00:06:55.080 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:06:55.080 [-f for fill workload, use this BYTE value (default 255)
00:06:55.080 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:55.080 [-y verify result if this switch is on]
00:06:55.080 [-a tasks to allocate per core (default: same value as -q)]
00:06:55.080 Can be used to spread operations across a wider range of memory.
00:06:55.080 17:50:47 -- common/autotest_common.sh@653 -- # es=1
00:06:55.080 17:50:47 -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:55.080 17:50:47 -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:06:55.080 17:50:47 -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:55.080
00:06:55.080 real 0m0.025s
00:06:55.080 user 0m0.012s
00:06:55.080 sys 0m0.014s
00:06:55.080 17:50:47 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:06:55.080 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:55.080 ************************************
00:06:55.080 END TEST accel_wrong_workload
00:06:55.080 ************************************
00:06:55.080 Error: writing output failed: Broken pipe
00:06:55.080 17:50:47 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1
00:06:55.080 17:50:47 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']'
00:06:55.080 17:50:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:55.080 17:50:47 -- common/autotest_common.sh@10 -- # set +x
00:06:55.080 ************************************
00:06:55.080 START TEST accel_negative_buffers
00:06:55.080 ************************************
00:06:55.080 17:50:47 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1
00:06:55.080 17:50:47 -- common/autotest_common.sh@650 -- # local es=0
00:06:55.080 17:50:47 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1
00:06:55.080 17:50:47 -- common/autotest_common.sh@638 -- # local arg=accel_perf
00:06:55.080 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:55.080 17:50:47 -- common/autotest_common.sh@642 -- # type -t accel_perf
00:06:55.080 17:50:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:55.080 17:50:47 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1
xor -y -x -1 00:06:55.081 17:50:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.081 17:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.081 17:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.081 17:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.081 17:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.081 17:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.081 17:50:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.081 17:50:47 -- accel/accel.sh@42 -- # jq -r . 00:06:55.081 -x option must be non-negative. 00:06:55.081 [2024-11-19 17:50:47.837551] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:55.081 accel_perf options: 00:06:55.081 [-h help message] 00:06:55.081 [-q queue depth per core] 00:06:55.081 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:55.081 [-T number of threads per core 00:06:55.081 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:55.081 [-t time in seconds] 00:06:55.081 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:55.081 [ dif_verify, , dif_generate, dif_generate_copy 00:06:55.081 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:55.081 [-l for compress/decompress workloads, name of uncompressed input file 00:06:55.081 [-S for crc32c workload, use this seed value (default 0) 00:06:55.081 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:55.081 [-f for fill workload, use this BYTE value (default 255) 00:06:55.081 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:55.081 [-y verify result if this switch is on] 00:06:55.081 [-a tasks to allocate per core (default: same value as -q)] 00:06:55.081 Can be used to spread operations across a wider range of memory. 
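Editor's note: the two failures above are deliberate. accel_wrong_workload feeds accel_perf an unknown `-w foobar` and accel_negative_buffers a negative `-x -1`, and the harness passes only when the app rejects both; the NOT wrapper visible in the xtrace (from autotest_common.sh) inverts the exit status. A minimal stand-alone sketch of that pattern, assuming only the binary path shown in the log (the lowercase `not` helper here is a stand-in for the harness's NOT, not the real helper):

#!/usr/bin/env bash
# Negative-argument checks, reconstructed from the two tests above.
# ACCEL_PERF only exists on the CI host; point it at a local SPDK
# build when trying this elsewhere.
ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf

not() {
    # Invert the exit status: succeed only when the command fails.
    if "$@"; then return 1; else return 0; fi
}

not "$ACCEL_PERF" -t 1 -w foobar          # unknown workload type
not "$ACCEL_PERF" -t 1 -w xor -y -x -1    # "-x option must be non-negative"
echo "both invalid invocations were rejected"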
00:06:55.081 17:50:47 -- common/autotest_common.sh@653 -- # es=1 00:06:55.081 17:50:47 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:55.081 17:50:47 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:55.081 17:50:47 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:55.081 00:06:55.081 real 0m0.023s 00:06:55.081 user 0m0.009s 00:06:55.081 sys 0m0.014s 00:06:55.081 17:50:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:55.081 17:50:47 -- common/autotest_common.sh@10 -- # set +x 00:06:55.081 ************************************ 00:06:55.081 END TEST accel_negative_buffers 00:06:55.081 ************************************ 00:06:55.081 Error: writing output failed: Broken pipe 00:06:55.081 17:50:47 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:55.081 17:50:47 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:55.081 17:50:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.081 17:50:47 -- common/autotest_common.sh@10 -- # set +x 00:06:55.081 ************************************ 00:06:55.081 START TEST accel_crc32c 00:06:55.081 ************************************ 00:06:55.081 17:50:47 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:55.081 17:50:47 -- accel/accel.sh@16 -- # local accel_opc 00:06:55.081 17:50:47 -- accel/accel.sh@17 -- # local accel_module 00:06:55.081 17:50:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:55.081 17:50:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:55.081 17:50:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.081 17:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.081 17:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.081 17:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.081 17:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.081 17:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.081 17:50:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.081 17:50:47 -- accel/accel.sh@42 -- # jq -r . 00:06:55.081 [2024-11-19 17:50:47.910842] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:55.081 [2024-11-19 17:50:47.910929] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622533 ] 00:06:55.340 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.340 [2024-11-19 17:50:47.981619] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.340 [2024-11-19 17:50:48.018555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.720 17:50:49 -- accel/accel.sh@18 -- # out=' 00:06:56.720 SPDK Configuration: 00:06:56.720 Core mask: 0x1 00:06:56.720 00:06:56.720 Accel Perf Configuration: 00:06:56.720 Workload Type: crc32c 00:06:56.720 CRC-32C seed: 32 00:06:56.720 Transfer size: 4096 bytes 00:06:56.720 Vector count 1 00:06:56.720 Module: software 00:06:56.720 Queue depth: 32 00:06:56.720 Allocate depth: 32 00:06:56.720 # threads/core: 1 00:06:56.720 Run time: 1 seconds 00:06:56.720 Verify: Yes 00:06:56.720 00:06:56.720 Running for 1 seconds... 
00:06:56.720 00:06:56.720 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.720 ------------------------------------------------------------------------------------ 00:06:56.720 0,0 848352/s 3313 MiB/s 0 0 00:06:56.720 ==================================================================================== 00:06:56.720 Total 848352/s 3313 MiB/s 0 0' 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:56.720 17:50:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:56.720 17:50:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.720 17:50:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.720 17:50:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.720 17:50:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.720 17:50:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.720 17:50:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.720 17:50:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.720 17:50:49 -- accel/accel.sh@42 -- # jq -r . 00:06:56.720 [2024-11-19 17:50:49.199018] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:56.720 [2024-11-19 17:50:49.199114] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622799 ] 00:06:56.720 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.720 [2024-11-19 17:50:49.266588] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.720 [2024-11-19 17:50:49.299302] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val=0x1 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val=crc32c 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val=32 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 
17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val=software 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val=32 00:06:56.720 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.720 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.720 17:50:49 -- accel/accel.sh@21 -- # val=32 00:06:56.721 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.721 17:50:49 -- accel/accel.sh@21 -- # val=1 00:06:56.721 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.721 17:50:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:56.721 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.721 17:50:49 -- accel/accel.sh@21 -- # val=Yes 00:06:56.721 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.721 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.721 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:56.721 17:50:49 -- accel/accel.sh@21 -- # val= 00:06:56.721 17:50:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # IFS=: 00:06:56.721 17:50:49 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@21 -- # val= 00:06:57.659 17:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # IFS=: 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@21 -- # val= 00:06:57.659 17:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # IFS=: 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@21 -- # val= 00:06:57.659 17:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # IFS=: 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@21 -- # val= 00:06:57.659 17:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # IFS=: 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@21 -- # val= 00:06:57.659 17:50:50 -- accel/accel.sh@22 -- # case "$var" in 
00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # IFS=: 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@21 -- # val= 00:06:57.659 17:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # IFS=: 00:06:57.659 17:50:50 -- accel/accel.sh@20 -- # read -r var val 00:06:57.659 17:50:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:57.659 17:50:50 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:57.659 17:50:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.659 00:06:57.659 real 0m2.573s 00:06:57.659 user 0m2.312s 00:06:57.659 sys 0m0.259s 00:06:57.659 17:50:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:57.659 17:50:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.659 ************************************ 00:06:57.659 END TEST accel_crc32c 00:06:57.659 ************************************ 00:06:57.659 17:50:50 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:57.659 17:50:50 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:57.659 17:50:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:57.659 17:50:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.659 ************************************ 00:06:57.659 START TEST accel_crc32c_C2 00:06:57.659 ************************************ 00:06:57.659 17:50:50 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:57.659 17:50:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:57.659 17:50:50 -- accel/accel.sh@17 -- # local accel_module 00:06:57.659 17:50:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:57.659 17:50:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:57.659 17:50:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.659 17:50:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.659 17:50:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.659 17:50:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.659 17:50:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.659 17:50:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.659 17:50:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.659 17:50:50 -- accel/accel.sh@42 -- # jq -r . 00:06:57.919 [2024-11-19 17:50:50.525426] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
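Editor's note: every positive run in this log launches accel_perf with `-c /dev/fd/62`, meaning the accel JSON config travels over an inherited file descriptor rather than a temp file. build_accel_config collects module fragments into the accel_json_cfg array (all the `[[ 0 -gt 0 ]]` driver guards are false here, so the config is effectively empty) and `jq -r .` normalizes it. A sketch of that mechanism under the same assumptions:

#!/usr/bin/env bash
# Config-over-fd trick used by accel.sh's build_accel_config.
ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf

accel_json_cfg=()            # module fragments would be appended here
IFS=,                        # fragments are comma-joined, as in the xtrace
config=$(jq -r . <<< "{ ${accel_json_cfg[*]} }")   # normalizes to "{}"
unset IFS

# accel_perf reads its JSON config from fd 62, matching "-c /dev/fd/62".
"$ACCEL_PERF" -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 62<<< "$config"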
00:06:57.919 [2024-11-19 17:50:50.525524] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid623082 ] 00:06:57.919 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.919 [2024-11-19 17:50:50.593929] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.919 [2024-11-19 17:50:50.629243] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.298 17:50:51 -- accel/accel.sh@18 -- # out=' 00:06:59.298 SPDK Configuration: 00:06:59.298 Core mask: 0x1 00:06:59.298 00:06:59.298 Accel Perf Configuration: 00:06:59.298 Workload Type: crc32c 00:06:59.298 CRC-32C seed: 0 00:06:59.298 Transfer size: 4096 bytes 00:06:59.298 Vector count 2 00:06:59.298 Module: software 00:06:59.298 Queue depth: 32 00:06:59.298 Allocate depth: 32 00:06:59.298 # threads/core: 1 00:06:59.298 Run time: 1 seconds 00:06:59.298 Verify: Yes 00:06:59.298 00:06:59.298 Running for 1 seconds... 00:06:59.298 00:06:59.298 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.298 ------------------------------------------------------------------------------------ 00:06:59.298 0,0 598816/s 4678 MiB/s 0 0 00:06:59.298 ==================================================================================== 00:06:59.298 Total 598816/s 2339 MiB/s 0 0' 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:59.298 17:50:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:59.298 17:50:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.298 17:50:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.298 17:50:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.298 17:50:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.298 17:50:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.298 17:50:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.298 17:50:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.298 17:50:51 -- accel/accel.sh@42 -- # jq -r . 00:06:59.298 [2024-11-19 17:50:51.809951] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
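Editor's note: the throughput columns can be sanity-checked by hand as MiB/s = transfers/s × bytes per operation / 2^20. For the single-vector crc32c run, 848352/s × 4096 B gives the 3313 MiB/s shown in both rows. For the two-vector run just above, the per-core row counts both 4096-byte vectors (≈4678 MiB/s) while its Total row appears to count only the base transfer size (≈2339 MiB/s), so the two rows of a -C 2 table differ by the vector count even at identical transfer rates. A quick awk check using the figures from these tables:

#!/usr/bin/env bash
# Recompute the MiB/s columns from transfers/s and bytes per operation.
mibps() {
    awk -v x="$1" -v b="$2" 'BEGIN { printf "%d MiB/s\n", x * b / (1024 * 1024) }'
}

mibps 848352 4096           # crc32c          -> 3313 MiB/s (both rows)
mibps 598816 $((2 * 4096))  # crc32c_C2 core  -> 4678 MiB/s
mibps 598816 4096           # crc32c_C2 Total -> 2339 MiB/s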
00:06:59.298 [2024-11-19 17:50:51.810047] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid623358 ] 00:06:59.298 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.298 [2024-11-19 17:50:51.877673] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.298 [2024-11-19 17:50:51.912110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val=0x1 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val=crc32c 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val=0 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.298 17:50:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.298 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.298 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val=software 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@23 -- # accel_module=software 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val=32 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val=32 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- 
accel/accel.sh@21 -- # val=1 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val=Yes 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:06:59.299 17:50:51 -- accel/accel.sh@21 -- # val= 00:06:59.299 17:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # IFS=: 00:06:59.299 17:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@21 -- # val= 00:07:00.237 17:50:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # IFS=: 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@21 -- # val= 00:07:00.237 17:50:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # IFS=: 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@21 -- # val= 00:07:00.237 17:50:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # IFS=: 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@21 -- # val= 00:07:00.237 17:50:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # IFS=: 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@21 -- # val= 00:07:00.237 17:50:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # IFS=: 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@21 -- # val= 00:07:00.237 17:50:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # IFS=: 00:07:00.237 17:50:53 -- accel/accel.sh@20 -- # read -r var val 00:07:00.237 17:50:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.237 17:50:53 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:07:00.237 17:50:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.237 00:07:00.237 real 0m2.568s 00:07:00.237 user 0m2.308s 00:07:00.237 sys 0m0.259s 00:07:00.237 17:50:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:00.237 17:50:53 -- common/autotest_common.sh@10 -- # set +x 00:07:00.237 ************************************ 00:07:00.237 END TEST accel_crc32c_C2 00:07:00.237 ************************************ 00:07:00.496 17:50:53 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:00.497 17:50:53 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:00.497 17:50:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:00.497 17:50:53 -- common/autotest_common.sh@10 -- # set +x 00:07:00.497 ************************************ 00:07:00.497 START TEST accel_copy 
00:07:00.497 ************************************ 00:07:00.497 17:50:53 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:07:00.497 17:50:53 -- accel/accel.sh@16 -- # local accel_opc 00:07:00.497 17:50:53 -- accel/accel.sh@17 -- # local accel_module 00:07:00.497 17:50:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:07:00.497 17:50:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:00.497 17:50:53 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.497 17:50:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.497 17:50:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.497 17:50:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.497 17:50:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.497 17:50:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.497 17:50:53 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.497 17:50:53 -- accel/accel.sh@42 -- # jq -r . 00:07:00.497 [2024-11-19 17:50:53.139556] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:00.497 [2024-11-19 17:50:53.139652] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid623542 ] 00:07:00.497 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.497 [2024-11-19 17:50:53.207296] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.497 [2024-11-19 17:50:53.242314] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.876 17:50:54 -- accel/accel.sh@18 -- # out=' 00:07:01.876 SPDK Configuration: 00:07:01.876 Core mask: 0x1 00:07:01.876 00:07:01.876 Accel Perf Configuration: 00:07:01.876 Workload Type: copy 00:07:01.876 Transfer size: 4096 bytes 00:07:01.876 Vector count 1 00:07:01.876 Module: software 00:07:01.876 Queue depth: 32 00:07:01.876 Allocate depth: 32 00:07:01.876 # threads/core: 1 00:07:01.876 Run time: 1 seconds 00:07:01.876 Verify: Yes 00:07:01.876 00:07:01.876 Running for 1 seconds... 00:07:01.876 00:07:01.876 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:01.876 ------------------------------------------------------------------------------------ 00:07:01.876 0,0 563680/s 2201 MiB/s 0 0 00:07:01.876 ==================================================================================== 00:07:01.876 Total 563680/s 2201 MiB/s 0 0' 00:07:01.876 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.876 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.876 17:50:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:01.876 17:50:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:01.876 17:50:54 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.876 17:50:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.877 17:50:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.877 17:50:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.877 17:50:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.877 17:50:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.877 17:50:54 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.877 17:50:54 -- accel/accel.sh@42 -- # jq -r . 00:07:01.877 [2024-11-19 17:50:54.424296] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
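Editor's note: the copy workload above is the baseline case: move 4 KiB buffers for one second and, with -y, verify every result, which lands at 563680 transfers/s (≈2201 MiB/s) on the software module. Re-running it outside the harness needs nothing beyond the binary and hugepages; a sketch using the path from the xtrace (root privileges and a local SPDK build assumed):

#!/usr/bin/env bash
# Stand-alone re-run of the copy benchmark above.
ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf

# -t 1: one-second run; -w copy: copy workload; -y: verify results.
# Queue/allocate depth keep the defaults (32/32) shown in the
# configuration block above.
sudo "$ACCEL_PERF" -t 1 -w copy -y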
00:07:01.877 [2024-11-19 17:50:54.424387] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid623683 ] 00:07:01.877 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.877 [2024-11-19 17:50:54.492610] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.877 [2024-11-19 17:50:54.526686] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=0x1 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=copy 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@24 -- # accel_opc=copy 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=software 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@23 -- # accel_module=software 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=32 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=32 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=1 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val=Yes 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:01.877 17:50:54 -- accel/accel.sh@21 -- # val= 00:07:01.877 17:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:01.877 17:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@21 -- # val= 00:07:03.258 17:50:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # IFS=: 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@21 -- # val= 00:07:03.258 17:50:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # IFS=: 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@21 -- # val= 00:07:03.258 17:50:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # IFS=: 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@21 -- # val= 00:07:03.258 17:50:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # IFS=: 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@21 -- # val= 00:07:03.258 17:50:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # IFS=: 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@21 -- # val= 00:07:03.258 17:50:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # IFS=: 00:07:03.258 17:50:55 -- accel/accel.sh@20 -- # read -r var val 00:07:03.258 17:50:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:03.258 17:50:55 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:07:03.258 17:50:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:03.258 00:07:03.258 real 0m2.570s 00:07:03.258 user 0m2.315s 00:07:03.258 sys 0m0.253s 00:07:03.258 17:50:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:03.258 17:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:03.258 ************************************ 00:07:03.258 END TEST accel_copy 00:07:03.258 ************************************ 00:07:03.258 17:50:55 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:03.258 17:50:55 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:03.258 17:50:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:03.258 17:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:03.258 ************************************ 00:07:03.258 START TEST accel_fill 00:07:03.258 ************************************ 00:07:03.258 17:50:55 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:03.258 17:50:55 -- accel/accel.sh@16 -- # local accel_opc 
00:07:03.258 17:50:55 -- accel/accel.sh@17 -- # local accel_module 00:07:03.258 17:50:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:03.258 17:50:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:03.258 17:50:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.258 17:50:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.258 17:50:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.258 17:50:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.258 17:50:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.258 17:50:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.258 17:50:55 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.258 17:50:55 -- accel/accel.sh@42 -- # jq -r . 00:07:03.258 [2024-11-19 17:50:55.751879] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:03.258 [2024-11-19 17:50:55.751972] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid623943 ] 00:07:03.258 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.258 [2024-11-19 17:50:55.819956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.258 [2024-11-19 17:50:55.854993] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.195 17:50:57 -- accel/accel.sh@18 -- # out=' 00:07:04.195 SPDK Configuration: 00:07:04.195 Core mask: 0x1 00:07:04.195 00:07:04.195 Accel Perf Configuration: 00:07:04.195 Workload Type: fill 00:07:04.195 Fill pattern: 0x80 00:07:04.195 Transfer size: 4096 bytes 00:07:04.195 Vector count 1 00:07:04.195 Module: software 00:07:04.195 Queue depth: 64 00:07:04.195 Allocate depth: 64 00:07:04.195 # threads/core: 1 00:07:04.195 Run time: 1 seconds 00:07:04.195 Verify: Yes 00:07:04.195 00:07:04.195 Running for 1 seconds... 00:07:04.195 00:07:04.195 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:04.195 ------------------------------------------------------------------------------------ 00:07:04.195 0,0 967488/s 3779 MiB/s 0 0 00:07:04.195 ==================================================================================== 00:07:04.195 Total 967488/s 3779 MiB/s 0 0' 00:07:04.195 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.195 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.195 17:50:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:04.195 17:50:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:04.195 17:50:57 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.195 17:50:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.195 17:50:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.195 17:50:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.195 17:50:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.195 17:50:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.195 17:50:57 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.195 17:50:57 -- accel/accel.sh@42 -- # jq -r . 00:07:04.195 [2024-11-19 17:50:57.037280] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
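Editor's note: accel_fill is the one test here that overrides the defaults: `-f 128` sets the fill byte (reported as pattern 0x80 in the configuration block) and `-q 64 -a 64` doubles queue and allocate depth relative to the other runs, which is why this table alone shows 64s. A sketch of the same invocation; the decimal-to-hex line just makes the -f/pattern relation explicit:

#!/usr/bin/env bash
# fill workload with the exact arguments from the test above.
ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf

printf 'fill byte: %d -> 0x%02x\n' 128 128   # -f takes decimal; 128 == 0x80
sudo "$ACCEL_PERF" -t 1 -w fill -f 128 -q 64 -a 64 -y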
00:07:04.195 [2024-11-19 17:50:57.037405] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624217 ] 00:07:04.454 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.454 [2024-11-19 17:50:57.106158] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.454 [2024-11-19 17:50:57.139927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val=0x1 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val=fill 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@24 -- # accel_opc=fill 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val=0x80 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.454 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.454 17:50:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:04.454 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val=software 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@23 -- # accel_module=software 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val=64 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val=64 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- 
accel/accel.sh@21 -- # val=1 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val=Yes 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:04.455 17:50:57 -- accel/accel.sh@21 -- # val= 00:07:04.455 17:50:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # IFS=: 00:07:04.455 17:50:57 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@21 -- # val= 00:07:05.834 17:50:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # IFS=: 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@21 -- # val= 00:07:05.834 17:50:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # IFS=: 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@21 -- # val= 00:07:05.834 17:50:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # IFS=: 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@21 -- # val= 00:07:05.834 17:50:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # IFS=: 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@21 -- # val= 00:07:05.834 17:50:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # IFS=: 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@21 -- # val= 00:07:05.834 17:50:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # IFS=: 00:07:05.834 17:50:58 -- accel/accel.sh@20 -- # read -r var val 00:07:05.834 17:50:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:05.834 17:50:58 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:07:05.834 17:50:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.834 00:07:05.834 real 0m2.569s 00:07:05.834 user 0m2.319s 00:07:05.834 sys 0m0.248s 00:07:05.834 17:50:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:05.834 17:50:58 -- common/autotest_common.sh@10 -- # set +x 00:07:05.834 ************************************ 00:07:05.834 END TEST accel_fill 00:07:05.834 ************************************ 00:07:05.834 17:50:58 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:05.834 17:50:58 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:05.834 17:50:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:05.834 17:50:58 -- common/autotest_common.sh@10 -- # set +x 00:07:05.834 ************************************ 00:07:05.834 START TEST 
accel_copy_crc32c 00:07:05.834 ************************************ 00:07:05.834 17:50:58 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:07:05.834 17:50:58 -- accel/accel.sh@16 -- # local accel_opc 00:07:05.834 17:50:58 -- accel/accel.sh@17 -- # local accel_module 00:07:05.834 17:50:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:05.834 17:50:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:05.834 17:50:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.834 17:50:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.834 17:50:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.834 17:50:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.834 17:50:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.834 17:50:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.834 17:50:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.834 17:50:58 -- accel/accel.sh@42 -- # jq -r . 00:07:05.834 [2024-11-19 17:50:58.365858] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:05.834 [2024-11-19 17:50:58.365949] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624498 ] 00:07:05.834 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.834 [2024-11-19 17:50:58.433943] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.834 [2024-11-19 17:50:58.468902] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.772 17:50:59 -- accel/accel.sh@18 -- # out=' 00:07:06.772 SPDK Configuration: 00:07:06.772 Core mask: 0x1 00:07:06.772 00:07:06.772 Accel Perf Configuration: 00:07:06.772 Workload Type: copy_crc32c 00:07:06.772 CRC-32C seed: 0 00:07:06.772 Vector size: 4096 bytes 00:07:06.772 Transfer size: 4096 bytes 00:07:06.772 Vector count 1 00:07:06.772 Module: software 00:07:06.772 Queue depth: 32 00:07:06.772 Allocate depth: 32 00:07:06.772 # threads/core: 1 00:07:06.772 Run time: 1 seconds 00:07:06.772 Verify: Yes 00:07:06.772 00:07:06.772 Running for 1 seconds... 00:07:06.772 00:07:06.772 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:06.772 ------------------------------------------------------------------------------------ 00:07:06.773 0,0 422496/s 1650 MiB/s 0 0 00:07:06.773 ==================================================================================== 00:07:06.773 Total 422496/s 1650 MiB/s 0 0' 00:07:06.773 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:06.773 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:06.773 17:50:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:06.773 17:50:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:06.773 17:50:59 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.773 17:50:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.773 17:50:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.773 17:50:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.773 17:50:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.773 17:50:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.773 17:50:59 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.773 17:50:59 -- accel/accel.sh@42 -- # jq -r . 
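Editor's note: copy_crc32c fuses the two earlier operations: each 4 KiB buffer is copied and CRC-32C'd in one accel op, which is why its 422496 ops/s lands below plain copy (563680/s) yet above what serially chaining the two stand-alone software rates would predict. A rough comparison, with the rates copied from the tables earlier in this log:

#!/usr/bin/env bash
# Compare the fused copy_crc32c rate with an idealized serial
# chain of the stand-alone copy and crc32c rates from this log.
copy_rate=563680; crc_rate=848352; fused_rate=422496

awk -v a="$copy_rate" -v b="$crc_rate" -v f="$fused_rate" 'BEGIN {
    serial = 1 / (1 / a + 1 / b)          # harmonic combination of two ops
    printf "serial copy+crc32c ~ %d ops/s, fused copy_crc32c = %d ops/s\n", serial, f
}'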
00:07:07.032 [2024-11-19 17:50:59.649618] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:07.032 [2024-11-19 17:50:59.649715] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624766 ] 00:07:07.032 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.032 [2024-11-19 17:50:59.716982] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.032 [2024-11-19 17:50:59.750955] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.032 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.032 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.032 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.032 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.032 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.032 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.032 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.032 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.032 17:50:59 -- accel/accel.sh@21 -- # val=0x1 00:07:07.032 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.032 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.032 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=copy_crc32c 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=0 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=software 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@23 -- # accel_module=software 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=32 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 
00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=32 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=1 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val=Yes 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:07.033 17:50:59 -- accel/accel.sh@21 -- # val= 00:07:07.033 17:50:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # IFS=: 00:07:07.033 17:50:59 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@21 -- # val= 00:07:08.413 17:51:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # IFS=: 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@21 -- # val= 00:07:08.413 17:51:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # IFS=: 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@21 -- # val= 00:07:08.413 17:51:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # IFS=: 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@21 -- # val= 00:07:08.413 17:51:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # IFS=: 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@21 -- # val= 00:07:08.413 17:51:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # IFS=: 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@21 -- # val= 00:07:08.413 17:51:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # IFS=: 00:07:08.413 17:51:00 -- accel/accel.sh@20 -- # read -r var val 00:07:08.413 17:51:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:08.413 17:51:00 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:07:08.413 17:51:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.413 00:07:08.413 real 0m2.569s 00:07:08.413 user 0m2.305s 00:07:08.413 sys 0m0.262s 00:07:08.413 17:51:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:08.413 17:51:00 -- common/autotest_common.sh@10 -- # set +x 00:07:08.413 ************************************ 00:07:08.413 END TEST accel_copy_crc32c 00:07:08.413 ************************************ 00:07:08.413 
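Editor's note: the real/user/sys triplet closing each test is bash's `time` output from inside the run_test wrapper, which also prints the START/END banners; roughly 2 of the ~2.57 s here are the two one-second accel_perf runs each test performs, with the remainder being app start-up and DPDK initialization. A rough reconstruction of the wrapper's shape (not the actual autotest_common.sh source):

#!/usr/bin/env bash
# Sketch of a run_test-style wrapper producing the banners and
# timing lines seen above (a reconstruction, not the real helper).
run_test() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"                     # emits the real/user/sys triplet
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return "$rc"
}

run_test demo_sleep sleep 1       # real ~0m1.00Xs, banners around it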
17:51:00 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:08.413 17:51:00 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:08.413 17:51:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:08.413 17:51:00 -- common/autotest_common.sh@10 -- # set +x 00:07:08.413 ************************************ 00:07:08.413 START TEST accel_copy_crc32c_C2 00:07:08.413 ************************************ 00:07:08.413 17:51:00 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:08.413 17:51:00 -- accel/accel.sh@16 -- # local accel_opc 00:07:08.413 17:51:00 -- accel/accel.sh@17 -- # local accel_module 00:07:08.413 17:51:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:08.413 17:51:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:08.413 17:51:00 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.413 17:51:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.413 17:51:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.413 17:51:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.413 17:51:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.413 17:51:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.413 17:51:00 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.413 17:51:00 -- accel/accel.sh@42 -- # jq -r . 00:07:08.413 [2024-11-19 17:51:00.977510] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:08.413 [2024-11-19 17:51:00.977608] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid625105 ] 00:07:08.413 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.413 [2024-11-19 17:51:01.047426] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.413 [2024-11-19 17:51:01.084580] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.794 17:51:02 -- accel/accel.sh@18 -- # out=' 00:07:09.794 SPDK Configuration: 00:07:09.794 Core mask: 0x1 00:07:09.794 00:07:09.794 Accel Perf Configuration: 00:07:09.794 Workload Type: copy_crc32c 00:07:09.794 CRC-32C seed: 0 00:07:09.794 Vector size: 4096 bytes 00:07:09.794 Transfer size: 8192 bytes 00:07:09.794 Vector count 2 00:07:09.794 Module: software 00:07:09.794 Queue depth: 32 00:07:09.794 Allocate depth: 32 00:07:09.794 # threads/core: 1 00:07:09.794 Run time: 1 seconds 00:07:09.794 Verify: Yes 00:07:09.794 00:07:09.794 Running for 1 seconds... 
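Editor's note: this final test adds `-C 2`, the io vector size from the usage text, so each operation spans two 4096-byte source vectors; the configuration block below reports this as Vector count 2 with an 8192-byte transfer size. Invocation sketch with the path from the xtrace:

#!/usr/bin/env bash
# copy_crc32c with a two-buffer io vector per operation (-C 2).
ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf

sudo "$ACCEL_PERF" -t 1 -w copy_crc32c -y -C 2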
00:07:09.794 00:07:09.794 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:09.794 ------------------------------------------------------------------------------------ 00:07:09.794 0,0 292768/s 2287 MiB/s 0 0 00:07:09.794 ==================================================================================== 00:07:09.794 Total 292768/s 1143 MiB/s 0 0' 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:09.794 17:51:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:09.794 17:51:02 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.794 17:51:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.794 17:51:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.794 17:51:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.794 17:51:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.794 17:51:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.794 17:51:02 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.794 17:51:02 -- accel/accel.sh@42 -- # jq -r . 00:07:09.794 [2024-11-19 17:51:02.266816] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:09.794 [2024-11-19 17:51:02.266923] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid625307 ] 00:07:09.794 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.794 [2024-11-19 17:51:02.334971] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.794 [2024-11-19 17:51:02.369492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val=0x1 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val=copy_crc32c 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.794 17:51:02 -- accel/accel.sh@21 -- # val=0 00:07:09.794 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # IFS=: 
00:07:09.794 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val='8192 bytes' 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val=software 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@23 -- # accel_module=software 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val=32 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val=32 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val=1 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val=Yes 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:09.795 17:51:02 -- accel/accel.sh@21 -- # val= 00:07:09.795 17:51:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # IFS=: 00:07:09.795 17:51:02 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@21 -- # val= 00:07:10.733 17:51:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # IFS=: 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@21 -- # val= 00:07:10.733 17:51:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # IFS=: 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@21 -- # val= 00:07:10.733 17:51:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # IFS=: 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@21 -- # val= 00:07:10.733 17:51:03 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # IFS=: 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@21 -- # val= 00:07:10.733 17:51:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # IFS=: 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@21 -- # val= 00:07:10.733 17:51:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # IFS=: 00:07:10.733 17:51:03 -- accel/accel.sh@20 -- # read -r var val 00:07:10.733 17:51:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:10.733 17:51:03 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:07:10.733 17:51:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.733 00:07:10.733 real 0m2.575s 00:07:10.733 user 0m2.318s 00:07:10.733 sys 0m0.255s 00:07:10.733 17:51:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:10.734 17:51:03 -- common/autotest_common.sh@10 -- # set +x 00:07:10.734 ************************************ 00:07:10.734 END TEST accel_copy_crc32c_C2 00:07:10.734 ************************************ 00:07:10.734 17:51:03 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:10.734 17:51:03 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:10.734 17:51:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:10.734 17:51:03 -- common/autotest_common.sh@10 -- # set +x 00:07:10.734 ************************************ 00:07:10.734 START TEST accel_dualcast 00:07:10.734 ************************************ 00:07:10.734 17:51:03 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:07:10.734 17:51:03 -- accel/accel.sh@16 -- # local accel_opc 00:07:10.734 17:51:03 -- accel/accel.sh@17 -- # local accel_module 00:07:10.734 17:51:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:07:10.734 17:51:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:10.734 17:51:03 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.734 17:51:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.734 17:51:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.734 17:51:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.734 17:51:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.734 17:51:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.734 17:51:03 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.734 17:51:03 -- accel/accel.sh@42 -- # jq -r . 00:07:10.734 [2024-11-19 17:51:03.596635] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
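(To rerun the -C 2 variant that just finished outside the harness, the accel_perf binary from the trace can presumably be invoked directly; the -c /dev/fd/62 argument only feeds it the JSON built by build_accel_config, so a bare run can drop it. A sketch, path as in the trace above:

/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
    -t 1 -w copy_crc32c -y -C 2    # 1-second verified run, two chained source vectors
)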
00:07:10.734 [2024-11-19 17:51:03.596728] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid625492 ] 00:07:10.993 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.993 [2024-11-19 17:51:03.666057] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.993 [2024-11-19 17:51:03.701927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.372 17:51:04 -- accel/accel.sh@18 -- # out=' 00:07:12.372 SPDK Configuration: 00:07:12.372 Core mask: 0x1 00:07:12.372 00:07:12.372 Accel Perf Configuration: 00:07:12.372 Workload Type: dualcast 00:07:12.372 Transfer size: 4096 bytes 00:07:12.372 Vector count 1 00:07:12.372 Module: software 00:07:12.372 Queue depth: 32 00:07:12.372 Allocate depth: 32 00:07:12.372 # threads/core: 1 00:07:12.372 Run time: 1 seconds 00:07:12.372 Verify: Yes 00:07:12.372 00:07:12.372 Running for 1 seconds... 00:07:12.372 00:07:12.372 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:12.372 ------------------------------------------------------------------------------------ 00:07:12.372 0,0 619424/s 2419 MiB/s 0 0 00:07:12.372 ==================================================================================== 00:07:12.372 Total 619424/s 2419 MiB/s 0 0' 00:07:12.372 17:51:04 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:04 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:12.372 17:51:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:12.372 17:51:04 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.372 17:51:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.372 17:51:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.372 17:51:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.372 17:51:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.372 17:51:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.372 17:51:04 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.372 17:51:04 -- accel/accel.sh@42 -- # jq -r . 00:07:12.372 [2024-11-19 17:51:04.884671] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
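(The dualcast workload timed above writes one 4096-byte source buffer out to two destination buffers per operation, hence the name. The reported bandwidth again follows from transfers/s times the transfer size; a sketch check with the 0,0 row above:

echo $((619424 * 4096 / 1048576))   # prints 2419 MiB/s, matching the table
)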
00:07:12.372 [2024-11-19 17:51:04.884794] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid626056 ] 00:07:12.372 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.372 [2024-11-19 17:51:04.954460] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.372 [2024-11-19 17:51:04.988476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=0x1 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=dualcast 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=software 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@23 -- # accel_module=software 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=32 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=32 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=1 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val=Yes 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:12.372 17:51:05 -- accel/accel.sh@21 -- # val= 00:07:12.372 17:51:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # IFS=: 00:07:12.372 17:51:05 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@21 -- # val= 00:07:13.310 17:51:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # IFS=: 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@21 -- # val= 00:07:13.310 17:51:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # IFS=: 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@21 -- # val= 00:07:13.310 17:51:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # IFS=: 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@21 -- # val= 00:07:13.310 17:51:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # IFS=: 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@21 -- # val= 00:07:13.310 17:51:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # IFS=: 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@21 -- # val= 00:07:13.310 17:51:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # IFS=: 00:07:13.310 17:51:06 -- accel/accel.sh@20 -- # read -r var val 00:07:13.310 17:51:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:13.310 17:51:06 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:07:13.310 17:51:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:13.310 00:07:13.310 real 0m2.573s 00:07:13.310 user 0m2.308s 00:07:13.310 sys 0m0.261s 00:07:13.310 17:51:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:13.310 17:51:06 -- common/autotest_common.sh@10 -- # set +x 00:07:13.310 ************************************ 00:07:13.310 END TEST accel_dualcast 00:07:13.310 ************************************ 00:07:13.570 17:51:06 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:13.570 17:51:06 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:13.570 17:51:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:13.570 17:51:06 -- common/autotest_common.sh@10 -- # set +x 00:07:13.570 ************************************ 00:07:13.570 START TEST accel_compare 00:07:13.570 ************************************ 00:07:13.570 17:51:06 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:07:13.570 17:51:06 -- accel/accel.sh@16 -- # local accel_opc 00:07:13.570 17:51:06 -- 
accel/accel.sh@17 -- # local accel_module 00:07:13.570 17:51:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:07:13.570 17:51:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:13.570 17:51:06 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.570 17:51:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.570 17:51:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.570 17:51:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.570 17:51:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.570 17:51:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.570 17:51:06 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.570 17:51:06 -- accel/accel.sh@42 -- # jq -r . 00:07:13.570 [2024-11-19 17:51:06.215384] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:13.570 [2024-11-19 17:51:06.215475] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid626472 ] 00:07:13.570 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.570 [2024-11-19 17:51:06.285652] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.570 [2024-11-19 17:51:06.320372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.950 17:51:07 -- accel/accel.sh@18 -- # out=' 00:07:14.950 SPDK Configuration: 00:07:14.950 Core mask: 0x1 00:07:14.950 00:07:14.950 Accel Perf Configuration: 00:07:14.950 Workload Type: compare 00:07:14.950 Transfer size: 4096 bytes 00:07:14.950 Vector count 1 00:07:14.950 Module: software 00:07:14.950 Queue depth: 32 00:07:14.950 Allocate depth: 32 00:07:14.950 # threads/core: 1 00:07:14.950 Run time: 1 seconds 00:07:14.950 Verify: Yes 00:07:14.950 00:07:14.950 Running for 1 seconds... 00:07:14.950 00:07:14.950 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:14.950 ------------------------------------------------------------------------------------ 00:07:14.950 0,0 810432/s 3165 MiB/s 0 0 00:07:14.950 ==================================================================================== 00:07:14.950 Total 810432/s 3165 MiB/s 0 0' 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.950 17:51:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:14.950 17:51:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:14.950 17:51:07 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.950 17:51:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.950 17:51:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.950 17:51:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.950 17:51:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.950 17:51:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.950 17:51:07 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.950 17:51:07 -- accel/accel.sh@42 -- # jq -r . 00:07:14.950 [2024-11-19 17:51:07.499546] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
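(compare walks two equal-sized buffers and reports any mismatch in the Miscompares column, which stays at 0 above. The bandwidth figure follows directly from the operation rate and the 4096-byte transfer size:

echo $((810432 * 4096 / 1048576))   # prints 3165 MiB/s, as in the 0,0 row above
)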
00:07:14.950 [2024-11-19 17:51:07.499642] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid626738 ] 00:07:14.950 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.950 [2024-11-19 17:51:07.566922] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.950 [2024-11-19 17:51:07.600588] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.950 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.950 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.950 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.950 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.950 17:51:07 -- accel/accel.sh@21 -- # val=0x1 00:07:14.950 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.950 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.950 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val=compare 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@24 -- # accel_opc=compare 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val=software 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@23 -- # accel_module=software 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val=32 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val=32 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val=1 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val=Yes 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:14.951 17:51:07 -- accel/accel.sh@21 -- # val= 00:07:14.951 17:51:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # IFS=: 00:07:14.951 17:51:07 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@21 -- # val= 00:07:16.330 17:51:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # IFS=: 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@21 -- # val= 00:07:16.330 17:51:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # IFS=: 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@21 -- # val= 00:07:16.330 17:51:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # IFS=: 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@21 -- # val= 00:07:16.330 17:51:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # IFS=: 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@21 -- # val= 00:07:16.330 17:51:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # IFS=: 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@21 -- # val= 00:07:16.330 17:51:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # IFS=: 00:07:16.330 17:51:08 -- accel/accel.sh@20 -- # read -r var val 00:07:16.330 17:51:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:16.330 17:51:08 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:07:16.330 17:51:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.330 00:07:16.330 real 0m2.569s 00:07:16.330 user 0m2.310s 00:07:16.330 sys 0m0.256s 00:07:16.330 17:51:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:16.330 17:51:08 -- common/autotest_common.sh@10 -- # set +x 00:07:16.330 ************************************ 00:07:16.330 END TEST accel_compare 00:07:16.330 ************************************ 00:07:16.330 17:51:08 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:16.330 17:51:08 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:16.330 17:51:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:16.330 17:51:08 -- common/autotest_common.sh@10 -- # set +x 00:07:16.330 ************************************ 00:07:16.330 START TEST accel_xor 00:07:16.330 ************************************ 00:07:16.330 17:51:08 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:07:16.330 17:51:08 -- accel/accel.sh@16 -- # local accel_opc 00:07:16.330 17:51:08 -- accel/accel.sh@17 
-- # local accel_module 00:07:16.330 17:51:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:07:16.330 17:51:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:16.330 17:51:08 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.330 17:51:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.330 17:51:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.330 17:51:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.330 17:51:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.330 17:51:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.330 17:51:08 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.330 17:51:08 -- accel/accel.sh@42 -- # jq -r . 00:07:16.330 [2024-11-19 17:51:08.828583] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:16.330 [2024-11-19 17:51:08.828718] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid627021 ] 00:07:16.331 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.331 [2024-11-19 17:51:08.898194] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.331 [2024-11-19 17:51:08.932761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.267 17:51:10 -- accel/accel.sh@18 -- # out=' 00:07:17.267 SPDK Configuration: 00:07:17.267 Core mask: 0x1 00:07:17.267 00:07:17.267 Accel Perf Configuration: 00:07:17.267 Workload Type: xor 00:07:17.267 Source buffers: 2 00:07:17.267 Transfer size: 4096 bytes 00:07:17.267 Vector count 1 00:07:17.267 Module: software 00:07:17.267 Queue depth: 32 00:07:17.267 Allocate depth: 32 00:07:17.267 # threads/core: 1 00:07:17.267 Run time: 1 seconds 00:07:17.267 Verify: Yes 00:07:17.267 00:07:17.267 Running for 1 seconds... 00:07:17.267 00:07:17.267 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:17.267 ------------------------------------------------------------------------------------ 00:07:17.267 0,0 702432/s 2743 MiB/s 0 0 00:07:17.267 ==================================================================================== 00:07:17.267 Total 702432/s 2743 MiB/s 0 0' 00:07:17.267 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.267 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.267 17:51:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:17.267 17:51:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:17.267 17:51:10 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.267 17:51:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.267 17:51:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.267 17:51:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.267 17:51:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.267 17:51:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.267 17:51:10 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.267 17:51:10 -- accel/accel.sh@42 -- # jq -r . 00:07:17.267 [2024-11-19 17:51:10.115201] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
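(For xor, the engine XORs the source buffers together into a destination buffer; with no -x flag the run above used the default shown in its configuration, "Source buffers: 2". The variant that follows passes -x 3 to add a third source. A sketch of the bare invocations, config fd omitted as before:

accel_perf -t 1 -w xor -y          # two source buffers (default, the run above)
accel_perf -t 1 -w xor -y -x 3     # three source buffers (the test that follows)
)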
00:07:17.267 [2024-11-19 17:51:10.115289] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid627221 ] 00:07:17.527 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.527 [2024-11-19 17:51:10.183750] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.527 [2024-11-19 17:51:10.218433] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=0x1 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=xor 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=2 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=software 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@23 -- # accel_module=software 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=32 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=32 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- 
accel/accel.sh@21 -- # val=1 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val=Yes 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:17.527 17:51:10 -- accel/accel.sh@21 -- # val= 00:07:17.527 17:51:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # IFS=: 00:07:17.527 17:51:10 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@21 -- # val= 00:07:18.907 17:51:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # IFS=: 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@21 -- # val= 00:07:18.907 17:51:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # IFS=: 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@21 -- # val= 00:07:18.907 17:51:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # IFS=: 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@21 -- # val= 00:07:18.907 17:51:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # IFS=: 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@21 -- # val= 00:07:18.907 17:51:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # IFS=: 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@21 -- # val= 00:07:18.907 17:51:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # IFS=: 00:07:18.907 17:51:11 -- accel/accel.sh@20 -- # read -r var val 00:07:18.907 17:51:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:18.907 17:51:11 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:18.907 17:51:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.907 00:07:18.907 real 0m2.575s 00:07:18.907 user 0m2.322s 00:07:18.907 sys 0m0.250s 00:07:18.907 17:51:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:18.907 17:51:11 -- common/autotest_common.sh@10 -- # set +x 00:07:18.907 ************************************ 00:07:18.907 END TEST accel_xor 00:07:18.907 ************************************ 00:07:18.907 17:51:11 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:18.907 17:51:11 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:18.907 17:51:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:18.907 17:51:11 -- common/autotest_common.sh@10 -- # set +x 00:07:18.907 ************************************ 00:07:18.907 START TEST accel_xor 
00:07:18.907 ************************************ 00:07:18.907 17:51:11 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:07:18.907 17:51:11 -- accel/accel.sh@16 -- # local accel_opc 00:07:18.907 17:51:11 -- accel/accel.sh@17 -- # local accel_module 00:07:18.907 17:51:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:18.907 17:51:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:18.907 17:51:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.907 17:51:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.907 17:51:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.907 17:51:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.907 17:51:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.907 17:51:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.907 17:51:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.907 17:51:11 -- accel/accel.sh@42 -- # jq -r . 00:07:18.908 [2024-11-19 17:51:11.444850] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:18.908 [2024-11-19 17:51:11.444940] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid627416 ] 00:07:18.908 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.908 [2024-11-19 17:51:11.513291] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.908 [2024-11-19 17:51:11.549038] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.285 17:51:12 -- accel/accel.sh@18 -- # out=' 00:07:20.285 SPDK Configuration: 00:07:20.285 Core mask: 0x1 00:07:20.285 00:07:20.285 Accel Perf Configuration: 00:07:20.285 Workload Type: xor 00:07:20.285 Source buffers: 3 00:07:20.285 Transfer size: 4096 bytes 00:07:20.285 Vector count 1 00:07:20.285 Module: software 00:07:20.285 Queue depth: 32 00:07:20.285 Allocate depth: 32 00:07:20.285 # threads/core: 1 00:07:20.285 Run time: 1 seconds 00:07:20.285 Verify: Yes 00:07:20.285 00:07:20.285 Running for 1 seconds... 00:07:20.285 00:07:20.285 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:20.285 ------------------------------------------------------------------------------------ 00:07:20.285 0,0 650880/s 2542 MiB/s 0 0 00:07:20.285 ==================================================================================== 00:07:20.285 Total 650880/s 2542 MiB/s 0 0' 00:07:20.285 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.285 17:51:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:20.285 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.285 17:51:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:20.285 17:51:12 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.285 17:51:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.285 17:51:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.285 17:51:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.285 17:51:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.285 17:51:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.285 17:51:12 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.285 17:51:12 -- accel/accel.sh@42 -- # jq -r . 00:07:20.285 [2024-11-19 17:51:12.717944] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
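(Adding a third source buffer drops the operation rate from 702432/s to 650880/s in these runs, plausibly because each XOR now reads half again as much source data. A sketch for comparing the two variants back to back:

for x in 2 3; do
  accel_perf -t 1 -w xor -y -x "$x"    # same 1-second verified run, varying source count
done
)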
00:07:20.285 [2024-11-19 17:51:12.717997] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid627600 ] 00:07:20.285 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.285 [2024-11-19 17:51:12.779536] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.286 [2024-11-19 17:51:12.813715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=0x1 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=xor 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=3 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=software 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@23 -- # accel_module=software 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=32 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=32 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- 
accel/accel.sh@21 -- # val=1 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val=Yes 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:20.286 17:51:12 -- accel/accel.sh@21 -- # val= 00:07:20.286 17:51:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # IFS=: 00:07:20.286 17:51:12 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@21 -- # val= 00:07:21.223 17:51:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # IFS=: 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@21 -- # val= 00:07:21.223 17:51:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # IFS=: 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@21 -- # val= 00:07:21.223 17:51:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # IFS=: 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@21 -- # val= 00:07:21.223 17:51:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # IFS=: 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@21 -- # val= 00:07:21.223 17:51:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # IFS=: 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@21 -- # val= 00:07:21.223 17:51:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # IFS=: 00:07:21.223 17:51:13 -- accel/accel.sh@20 -- # read -r var val 00:07:21.223 17:51:13 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:21.223 17:51:13 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:21.223 17:51:13 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:21.223 00:07:21.223 real 0m2.550s 00:07:21.223 user 0m2.303s 00:07:21.223 sys 0m0.246s 00:07:21.223 17:51:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:21.223 17:51:13 -- common/autotest_common.sh@10 -- # set +x 00:07:21.223 ************************************ 00:07:21.223 END TEST accel_xor 00:07:21.223 ************************************ 00:07:21.223 17:51:14 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:21.223 17:51:14 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:21.223 17:51:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.223 17:51:14 -- common/autotest_common.sh@10 -- # set +x 00:07:21.223 ************************************ 00:07:21.223 START TEST 
accel_dif_verify 00:07:21.223 ************************************ 00:07:21.223 17:51:14 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:07:21.223 17:51:14 -- accel/accel.sh@16 -- # local accel_opc 00:07:21.223 17:51:14 -- accel/accel.sh@17 -- # local accel_module 00:07:21.223 17:51:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:21.223 17:51:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:21.223 17:51:14 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.223 17:51:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.223 17:51:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.223 17:51:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.223 17:51:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.223 17:51:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.223 17:51:14 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.223 17:51:14 -- accel/accel.sh@42 -- # jq -r . 00:07:21.223 [2024-11-19 17:51:14.031699] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:21.223 [2024-11-19 17:51:14.031794] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid627881 ] 00:07:21.223 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.481 [2024-11-19 17:51:14.100366] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.481 [2024-11-19 17:51:14.134843] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.858 17:51:15 -- accel/accel.sh@18 -- # out=' 00:07:22.858 SPDK Configuration: 00:07:22.858 Core mask: 0x1 00:07:22.858 00:07:22.858 Accel Perf Configuration: 00:07:22.858 Workload Type: dif_verify 00:07:22.858 Vector size: 4096 bytes 00:07:22.858 Transfer size: 4096 bytes 00:07:22.858 Block size: 512 bytes 00:07:22.858 Metadata size: 8 bytes 00:07:22.858 Vector count 1 00:07:22.858 Module: software 00:07:22.858 Queue depth: 32 00:07:22.858 Allocate depth: 32 00:07:22.858 # threads/core: 1 00:07:22.858 Run time: 1 seconds 00:07:22.858 Verify: No 00:07:22.858 00:07:22.858 Running for 1 seconds... 00:07:22.858 00:07:22.858 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:22.858 ------------------------------------------------------------------------------------ 00:07:22.858 0,0 246080/s 976 MiB/s 0 0 00:07:22.858 ==================================================================================== 00:07:22.859 Total 246080/s 961 MiB/s 0 0' 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:22.859 17:51:15 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.859 17:51:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.859 17:51:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.859 17:51:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.859 17:51:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.859 17:51:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.859 17:51:15 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.859 17:51:15 -- accel/accel.sh@42 -- # jq -r . 
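(dif_verify checks the 8-byte Data Integrity Field carried with every 512-byte block, per the "Block size: 512" and "Metadata size: 8" lines in the configuration above; Verify reads "No" because the workload itself is the verification pass. The Total row above matches transfers/s times the 4096-byte transfer size, while the per-core row appears to count the DIF metadata too (4096 + 8 * (4096/512) = 4160 bytes per transfer):

echo $((246080 * 4096 / 1048576))   # prints 961 MiB/s, the Total row
echo $((246080 * 4160 / 1048576))   # prints 976 MiB/s, the 0,0 row
)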
00:07:22.859 [2024-11-19 17:51:15.314216] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:22.859 [2024-11-19 17:51:15.314304] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid628149 ] 00:07:22.859 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.859 [2024-11-19 17:51:15.381884] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.859 [2024-11-19 17:51:15.415925] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=0x1 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=dif_verify 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=software 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=32 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=32 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=1 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val=No 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:22.859 17:51:15 -- accel/accel.sh@21 -- # val= 00:07:22.859 17:51:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # IFS=: 00:07:22.859 17:51:15 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@21 -- # val= 00:07:23.798 17:51:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # IFS=: 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@21 -- # val= 00:07:23.798 17:51:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # IFS=: 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@21 -- # val= 00:07:23.798 17:51:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # IFS=: 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@21 -- # val= 00:07:23.798 17:51:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # IFS=: 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@21 -- # val= 00:07:23.798 17:51:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # IFS=: 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@21 -- # val= 00:07:23.798 17:51:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # IFS=: 00:07:23.798 17:51:16 -- accel/accel.sh@20 -- # read -r var val 00:07:23.798 17:51:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:23.798 17:51:16 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:23.798 17:51:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.798 00:07:23.798 real 0m2.566s 00:07:23.798 user 0m2.312s 00:07:23.798 sys 0m0.253s 00:07:23.798 17:51:16 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:07:23.798 17:51:16 -- common/autotest_common.sh@10 -- # set +x 00:07:23.798 ************************************ 00:07:23.798 END TEST accel_dif_verify 00:07:23.798 ************************************ 00:07:23.798 17:51:16 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:23.798 17:51:16 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:23.798 17:51:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.798 17:51:16 -- common/autotest_common.sh@10 -- # set +x 00:07:23.798 ************************************ 00:07:23.798 START TEST accel_dif_generate 00:07:23.798 ************************************ 00:07:23.798 17:51:16 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:07:23.798 17:51:16 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.798 17:51:16 -- accel/accel.sh@17 -- # local accel_module 00:07:23.798 17:51:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:23.798 17:51:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:23.798 17:51:16 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.798 17:51:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.798 17:51:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.798 17:51:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.798 17:51:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.798 17:51:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.798 17:51:16 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.798 17:51:16 -- accel/accel.sh@42 -- # jq -r . 00:07:23.798 [2024-11-19 17:51:16.638359] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:23.798 [2024-11-19 17:51:16.638444] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid628436 ] 00:07:24.057 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.057 [2024-11-19 17:51:16.706351] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.057 [2024-11-19 17:51:16.741078] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.435 17:51:17 -- accel/accel.sh@18 -- # out=' 00:07:25.435 SPDK Configuration: 00:07:25.435 Core mask: 0x1 00:07:25.435 00:07:25.435 Accel Perf Configuration: 00:07:25.435 Workload Type: dif_generate 00:07:25.435 Vector size: 4096 bytes 00:07:25.435 Transfer size: 4096 bytes 00:07:25.435 Block size: 512 bytes 00:07:25.435 Metadata size: 8 bytes 00:07:25.435 Vector count 1 00:07:25.435 Module: software 00:07:25.435 Queue depth: 32 00:07:25.435 Allocate depth: 32 00:07:25.435 # threads/core: 1 00:07:25.435 Run time: 1 seconds 00:07:25.435 Verify: No 00:07:25.435 00:07:25.435 Running for 1 seconds... 
00:07:25.435 00:07:25.435 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:25.436 ------------------------------------------------------------------------------------ 00:07:25.436 0,0 279360/s 1108 MiB/s 0 0 00:07:25.436 ==================================================================================== 00:07:25.436 Total 279360/s 1091 MiB/s 0 0' 00:07:25.436 17:51:17 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:25.436 17:51:17 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:25.436 17:51:17 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.436 17:51:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.436 17:51:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.436 17:51:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.436 17:51:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.436 17:51:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.436 17:51:17 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.436 17:51:17 -- accel/accel.sh@42 -- # jq -r . 00:07:25.436 [2024-11-19 17:51:17.910146] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:25.436 [2024-11-19 17:51:17.910201] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid628701 ] 00:07:25.436 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.436 [2024-11-19 17:51:17.972129] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.436 [2024-11-19 17:51:18.006456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=0x1 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=dif_generate 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 
00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=software 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@23 -- # accel_module=software 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=32 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=32 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=1 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val=No 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:25.436 17:51:18 -- accel/accel.sh@21 -- # val= 00:07:25.436 17:51:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # IFS=: 00:07:25.436 17:51:18 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@21 -- # val= 00:07:26.375 17:51:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # IFS=: 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@21 -- # val= 00:07:26.375 17:51:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # IFS=: 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@21 -- # val= 00:07:26.375 17:51:19 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # IFS=: 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@21 -- # val= 00:07:26.375 17:51:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # IFS=: 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@21 -- # val= 00:07:26.375 17:51:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # IFS=: 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@21 -- # val= 00:07:26.375 17:51:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # IFS=: 00:07:26.375 17:51:19 -- accel/accel.sh@20 -- # read -r var val 00:07:26.375 17:51:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:26.375 17:51:19 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:26.375 17:51:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.375 00:07:26.375 real 0m2.551s 00:07:26.375 user 0m2.318s 00:07:26.375 sys 0m0.233s 00:07:26.375 17:51:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:26.375 17:51:19 -- common/autotest_common.sh@10 -- # set +x 00:07:26.375 ************************************ 00:07:26.375 END TEST accel_dif_generate 00:07:26.375 ************************************ 00:07:26.375 17:51:19 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:26.375 17:51:19 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:26.375 17:51:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:26.375 17:51:19 -- common/autotest_common.sh@10 -- # set +x 00:07:26.375 ************************************ 00:07:26.375 START TEST accel_dif_generate_copy 00:07:26.375 ************************************ 00:07:26.375 17:51:19 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:26.375 17:51:19 -- accel/accel.sh@16 -- # local accel_opc 00:07:26.375 17:51:19 -- accel/accel.sh@17 -- # local accel_module 00:07:26.375 17:51:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:26.375 17:51:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:26.375 17:51:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.375 17:51:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.375 17:51:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.375 17:51:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.375 17:51:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.375 17:51:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.375 17:51:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.375 17:51:19 -- accel/accel.sh@42 -- # jq -r . 00:07:26.375 [2024-11-19 17:51:19.216632] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:26.375 [2024-11-19 17:51:19.216698] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid628869 ] 00:07:26.635 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.635 [2024-11-19 17:51:19.280755] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.635 [2024-11-19 17:51:19.315971] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.014 17:51:20 -- accel/accel.sh@18 -- # out=' 00:07:28.014 SPDK Configuration: 00:07:28.014 Core mask: 0x1 00:07:28.014 00:07:28.014 Accel Perf Configuration: 00:07:28.014 Workload Type: dif_generate_copy 00:07:28.014 Vector size: 4096 bytes 00:07:28.014 Transfer size: 4096 bytes 00:07:28.014 Vector count 1 00:07:28.014 Module: software 00:07:28.014 Queue depth: 32 00:07:28.014 Allocate depth: 32 00:07:28.014 # threads/core: 1 00:07:28.014 Run time: 1 seconds 00:07:28.014 Verify: No 00:07:28.014 00:07:28.014 Running for 1 seconds... 00:07:28.014 00:07:28.014 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:28.014 ------------------------------------------------------------------------------------ 00:07:28.014 0,0 222624/s 883 MiB/s 0 0 00:07:28.014 ==================================================================================== 00:07:28.014 Total 222624/s 869 MiB/s 0 0' 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.014 17:51:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.014 17:51:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:28.014 17:51:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.014 17:51:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.014 17:51:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.014 17:51:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.014 17:51:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.014 17:51:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.014 17:51:20 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.014 17:51:20 -- accel/accel.sh@42 -- # jq -r . 00:07:28.014 [2024-11-19 17:51:20.485983] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:28.014 [2024-11-19 17:51:20.486037] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629018 ] 00:07:28.014 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.014 [2024-11-19 17:51:20.549879] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.014 [2024-11-19 17:51:20.585276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.014 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.014 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.014 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.014 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.014 17:51:20 -- accel/accel.sh@21 -- # val=0x1 00:07:28.014 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.014 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.014 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.014 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val=software 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@23 -- # accel_module=software 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val=32 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val=32 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var 
val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val=1 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val=No 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.015 17:51:20 -- accel/accel.sh@21 -- # val= 00:07:28.015 17:51:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # IFS=: 00:07:28.015 17:51:20 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@21 -- # val= 00:07:28.954 17:51:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # IFS=: 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@21 -- # val= 00:07:28.954 17:51:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # IFS=: 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@21 -- # val= 00:07:28.954 17:51:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # IFS=: 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@21 -- # val= 00:07:28.954 17:51:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # IFS=: 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@21 -- # val= 00:07:28.954 17:51:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # IFS=: 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@21 -- # val= 00:07:28.954 17:51:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # IFS=: 00:07:28.954 17:51:21 -- accel/accel.sh@20 -- # read -r var val 00:07:28.954 17:51:21 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:28.954 17:51:21 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:28.954 17:51:21 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.954 00:07:28.954 real 0m2.541s 00:07:28.954 user 0m2.302s 00:07:28.954 sys 0m0.237s 00:07:28.954 17:51:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:28.954 17:51:21 -- common/autotest_common.sh@10 -- # set +x 00:07:28.954 ************************************ 00:07:28.954 END TEST accel_dif_generate_copy 00:07:28.954 ************************************ 00:07:28.954 17:51:21 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:28.954 17:51:21 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:28.954 17:51:21 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:28.954 17:51:21 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:28.954 17:51:21 -- common/autotest_common.sh@10 -- # set +x 00:07:28.954 ************************************ 00:07:28.954 START TEST accel_comp 00:07:28.954 ************************************ 00:07:28.954 17:51:21 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:28.954 17:51:21 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.954 17:51:21 -- accel/accel.sh@17 -- # local accel_module 00:07:28.954 17:51:21 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:28.954 17:51:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:28.954 17:51:21 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.954 17:51:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.954 17:51:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.954 17:51:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.954 17:51:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.954 17:51:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.954 17:51:21 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.954 17:51:21 -- accel/accel.sh@42 -- # jq -r . 00:07:28.954 [2024-11-19 17:51:21.805222] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:28.954 [2024-11-19 17:51:21.805317] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629299 ] 00:07:29.218 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.218 [2024-11-19 17:51:21.872737] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.218 [2024-11-19 17:51:21.907780] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.605 17:51:23 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:30.605 00:07:30.605 SPDK Configuration: 00:07:30.605 Core mask: 0x1 00:07:30.605 00:07:30.605 Accel Perf Configuration: 00:07:30.605 Workload Type: compress 00:07:30.605 Transfer size: 4096 bytes 00:07:30.605 Vector count 1 00:07:30.605 Module: software 00:07:30.605 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:30.605 Queue depth: 32 00:07:30.605 Allocate depth: 32 00:07:30.605 # threads/core: 1 00:07:30.605 Run time: 1 seconds 00:07:30.605 Verify: No 00:07:30.605 00:07:30.605 Running for 1 seconds... 
00:07:30.605 00:07:30.605 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:30.605 ------------------------------------------------------------------------------------ 00:07:30.605 0,0 68064/s 283 MiB/s 0 0 00:07:30.605 ==================================================================================== 00:07:30.605 Total 68064/s 265 MiB/s 0 0' 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:30.605 17:51:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:30.605 17:51:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:30.605 17:51:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.605 17:51:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.605 17:51:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:30.605 17:51:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:30.605 17:51:23 -- accel/accel.sh@41 -- # local IFS=, 00:07:30.605 17:51:23 -- accel/accel.sh@42 -- # jq -r . 00:07:30.605 [2024-11-19 17:51:23.088317] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:30.605 [2024-11-19 17:51:23.088418] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629567 ] 00:07:30.605 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.605 [2024-11-19 17:51:23.156857] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.605 [2024-11-19 17:51:23.190654] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=0x1 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=compress 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 
17:51:23 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=software 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@23 -- # accel_module=software 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=32 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=32 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=1 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val=No 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:30.605 17:51:23 -- accel/accel.sh@21 -- # val= 00:07:30.605 17:51:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # IFS=: 00:07:30.605 17:51:23 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@21 -- # val= 00:07:31.544 17:51:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # IFS=: 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@21 -- # val= 00:07:31.544 17:51:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # IFS=: 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@21 -- # val= 00:07:31.544 17:51:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # 
IFS=: 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@21 -- # val= 00:07:31.544 17:51:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # IFS=: 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@21 -- # val= 00:07:31.544 17:51:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # IFS=: 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@21 -- # val= 00:07:31.544 17:51:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # IFS=: 00:07:31.544 17:51:24 -- accel/accel.sh@20 -- # read -r var val 00:07:31.544 17:51:24 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:31.544 17:51:24 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:31.544 17:51:24 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.544 00:07:31.544 real 0m2.568s 00:07:31.544 user 0m2.320s 00:07:31.544 sys 0m0.246s 00:07:31.544 17:51:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:31.544 17:51:24 -- common/autotest_common.sh@10 -- # set +x 00:07:31.544 ************************************ 00:07:31.544 END TEST accel_comp 00:07:31.544 ************************************ 00:07:31.544 17:51:24 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:31.544 17:51:24 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:31.544 17:51:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:31.544 17:51:24 -- common/autotest_common.sh@10 -- # set +x 00:07:31.544 ************************************ 00:07:31.544 START TEST accel_decomp 00:07:31.544 ************************************ 00:07:31.544 17:51:24 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:31.544 17:51:24 -- accel/accel.sh@16 -- # local accel_opc 00:07:31.544 17:51:24 -- accel/accel.sh@17 -- # local accel_module 00:07:31.544 17:51:24 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:31.544 17:51:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:31.544 17:51:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:31.544 17:51:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:31.544 17:51:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.544 17:51:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.544 17:51:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:31.544 17:51:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:31.544 17:51:24 -- accel/accel.sh@41 -- # local IFS=, 00:07:31.544 17:51:24 -- accel/accel.sh@42 -- # jq -r . 00:07:31.803 [2024-11-19 17:51:24.412758] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:31.803 [2024-11-19 17:51:24.412848] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629848 ] 00:07:31.803 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.803 [2024-11-19 17:51:24.481745] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.803 [2024-11-19 17:51:24.516299] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.185 17:51:25 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:33.185 00:07:33.185 SPDK Configuration: 00:07:33.185 Core mask: 0x1 00:07:33.185 00:07:33.185 Accel Perf Configuration: 00:07:33.185 Workload Type: decompress 00:07:33.185 Transfer size: 4096 bytes 00:07:33.185 Vector count 1 00:07:33.185 Module: software 00:07:33.185 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.185 Queue depth: 32 00:07:33.185 Allocate depth: 32 00:07:33.185 # threads/core: 1 00:07:33.185 Run time: 1 seconds 00:07:33.185 Verify: Yes 00:07:33.185 00:07:33.185 Running for 1 seconds... 00:07:33.185 00:07:33.185 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:33.185 ------------------------------------------------------------------------------------ 00:07:33.185 0,0 94240/s 173 MiB/s 0 0 00:07:33.185 ==================================================================================== 00:07:33.186 Total 94240/s 368 MiB/s 0 0' 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:33.186 17:51:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:33.186 17:51:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:33.186 17:51:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.186 17:51:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.186 17:51:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:33.186 17:51:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:33.186 17:51:25 -- accel/accel.sh@41 -- # local IFS=, 00:07:33.186 17:51:25 -- accel/accel.sh@42 -- # jq -r . 00:07:33.186 [2024-11-19 17:51:25.687734] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:33.186 [2024-11-19 17:51:25.687788] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630114 ] 00:07:33.186 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.186 [2024-11-19 17:51:25.751259] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.186 [2024-11-19 17:51:25.784930] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=0x1 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=decompress 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=software 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@23 -- # accel_module=software 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=32 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 
-- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=32 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=1 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val=Yes 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:33.186 17:51:25 -- accel/accel.sh@21 -- # val= 00:07:33.186 17:51:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # IFS=: 00:07:33.186 17:51:25 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@21 -- # val= 00:07:34.125 17:51:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # IFS=: 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@21 -- # val= 00:07:34.125 17:51:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # IFS=: 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@21 -- # val= 00:07:34.125 17:51:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # IFS=: 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@21 -- # val= 00:07:34.125 17:51:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # IFS=: 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@21 -- # val= 00:07:34.125 17:51:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # IFS=: 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@21 -- # val= 00:07:34.125 17:51:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # IFS=: 00:07:34.125 17:51:26 -- accel/accel.sh@20 -- # read -r var val 00:07:34.125 17:51:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:34.125 17:51:26 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:34.125 17:51:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.125 00:07:34.125 real 0m2.557s 00:07:34.125 user 0m2.313s 00:07:34.125 sys 0m0.243s 00:07:34.125 17:51:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:34.125 17:51:26 -- common/autotest_common.sh@10 -- # set +x 00:07:34.125 ************************************ 00:07:34.125 END TEST accel_decomp 00:07:34.125 ************************************ 00:07:34.125 17:51:26 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:34.125 17:51:26 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:34.125 17:51:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.125 17:51:26 -- common/autotest_common.sh@10 -- # set +x 00:07:34.384 ************************************ 00:07:34.384 START TEST accel_decmop_full 00:07:34.384 ************************************ 00:07:34.384 17:51:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:34.384 17:51:26 -- accel/accel.sh@16 -- # local accel_opc 00:07:34.384 17:51:26 -- accel/accel.sh@17 -- # local accel_module 00:07:34.384 17:51:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:34.384 17:51:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:34.384 17:51:26 -- accel/accel.sh@12 -- # build_accel_config 00:07:34.384 17:51:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:34.384 17:51:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.384 17:51:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.384 17:51:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:34.384 17:51:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:34.384 17:51:26 -- accel/accel.sh@41 -- # local IFS=, 00:07:34.384 17:51:26 -- accel/accel.sh@42 -- # jq -r . 00:07:34.384 [2024-11-19 17:51:27.010833] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:34.384 [2024-11-19 17:51:27.010928] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630297 ] 00:07:34.384 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.384 [2024-11-19 17:51:27.079816] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.384 [2024-11-19 17:51:27.115978] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.765 17:51:28 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:35.765 00:07:35.765 SPDK Configuration: 00:07:35.765 Core mask: 0x1 00:07:35.765 00:07:35.765 Accel Perf Configuration: 00:07:35.765 Workload Type: decompress 00:07:35.765 Transfer size: 111250 bytes 00:07:35.765 Vector count 1 00:07:35.765 Module: software 00:07:35.765 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:35.765 Queue depth: 32 00:07:35.765 Allocate depth: 32 00:07:35.765 # threads/core: 1 00:07:35.765 Run time: 1 seconds 00:07:35.765 Verify: Yes 00:07:35.765 00:07:35.765 Running for 1 seconds... 
00:07:35.765 00:07:35.765 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:35.765 ------------------------------------------------------------------------------------ 00:07:35.765 0,0 5920/s 244 MiB/s 0 0 00:07:35.765 ==================================================================================== 00:07:35.765 Total 5920/s 628 MiB/s 0 0' 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:35.765 17:51:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:35.765 17:51:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:35.765 17:51:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.765 17:51:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.765 17:51:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:35.765 17:51:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:35.765 17:51:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:35.765 17:51:28 -- accel/accel.sh@42 -- # jq -r . 00:07:35.765 [2024-11-19 17:51:28.298267] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:35.765 [2024-11-19 17:51:28.298322] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630450 ] 00:07:35.765 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.765 [2024-11-19 17:51:28.360890] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.765 [2024-11-19 17:51:28.395116] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=0x1 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=decompress 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=software 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@23 -- # accel_module=software 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=32 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=32 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=1 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val=Yes 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:35.765 17:51:28 -- accel/accel.sh@21 -- # val= 00:07:35.765 17:51:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # IFS=: 00:07:35.765 17:51:28 -- accel/accel.sh@20 -- # read -r var val 00:07:36.704 17:51:29 -- accel/accel.sh@21 -- # val= 00:07:36.704 17:51:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.704 17:51:29 -- accel/accel.sh@20 -- # IFS=: 00:07:36.704 17:51:29 -- accel/accel.sh@20 -- # read -r var val 00:07:36.704 17:51:29 -- accel/accel.sh@21 -- # val= 00:07:36.704 17:51:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.704 17:51:29 -- accel/accel.sh@20 -- # IFS=: 00:07:36.704 17:51:29 -- accel/accel.sh@20 -- # read -r var val 00:07:36.704 17:51:29 -- accel/accel.sh@21 -- # val= 00:07:36.704 17:51:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.704 17:51:29 
-- accel/accel.sh@20 -- # IFS=: 00:07:36.704 17:51:29 -- accel/accel.sh@20 -- # read -r var val 00:07:36.963 17:51:29 -- accel/accel.sh@21 -- # val= 00:07:36.963 17:51:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.963 17:51:29 -- accel/accel.sh@20 -- # IFS=: 00:07:36.963 17:51:29 -- accel/accel.sh@20 -- # read -r var val 00:07:36.963 17:51:29 -- accel/accel.sh@21 -- # val= 00:07:36.963 17:51:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.963 17:51:29 -- accel/accel.sh@20 -- # IFS=: 00:07:36.963 17:51:29 -- accel/accel.sh@20 -- # read -r var val 00:07:36.963 17:51:29 -- accel/accel.sh@21 -- # val= 00:07:36.963 17:51:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.963 17:51:29 -- accel/accel.sh@20 -- # IFS=: 00:07:36.963 17:51:29 -- accel/accel.sh@20 -- # read -r var val 00:07:36.963 17:51:29 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:36.963 17:51:29 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:36.963 17:51:29 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.963 00:07:36.963 real 0m2.579s 00:07:36.963 user 0m2.337s 00:07:36.963 sys 0m0.239s 00:07:36.963 17:51:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:36.963 17:51:29 -- common/autotest_common.sh@10 -- # set +x 00:07:36.963 ************************************ 00:07:36.963 END TEST accel_decmop_full 00:07:36.963 ************************************ 00:07:36.963 17:51:29 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:36.963 17:51:29 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:36.963 17:51:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:36.963 17:51:29 -- common/autotest_common.sh@10 -- # set +x 00:07:36.963 ************************************ 00:07:36.963 START TEST accel_decomp_mcore 00:07:36.963 ************************************ 00:07:36.963 17:51:29 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:36.963 17:51:29 -- accel/accel.sh@16 -- # local accel_opc 00:07:36.963 17:51:29 -- accel/accel.sh@17 -- # local accel_module 00:07:36.963 17:51:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:36.963 17:51:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:36.963 17:51:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:36.963 17:51:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:36.963 17:51:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.963 17:51:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.963 17:51:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:36.963 17:51:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:36.963 17:51:29 -- accel/accel.sh@41 -- # local IFS=, 00:07:36.963 17:51:29 -- accel/accel.sh@42 -- # jq -r . 00:07:36.963 [2024-11-19 17:51:29.631551] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
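[Editor's note] The repeated "IFS=:" / "read -r var val" / case "$var" xtrace above is accel.sh walking the captured accel_perf report line by line, splitting each "Key: value" line on the colon and latching the workload and module it finds. A minimal sketch of that pattern, with illustrative key strings (the real script's case arms and the capture of $out are not reproduced here):

    # Split each report line on ':' and record the fields of interest.
    while IFS=: read -r var val; do
        case "$var" in
            *'Workload Type'*)
                accel_opc=$(echo $val) ;;    # unquoted echo trims the padding
            *Module*)
                accel_module=$(echo $val) ;;
        esac
    done <<< "$out"

The [[ -n software ]] / [[ -n decompress ]] checks at the end of each test then simply assert that both fields were seen in the report.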
00:07:36.963 [2024-11-19 17:51:29.631645] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630709 ] 00:07:36.963 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.963 [2024-11-19 17:51:29.700667] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:36.963 [2024-11-19 17:51:29.738321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:36.963 [2024-11-19 17:51:29.738413] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:36.963 [2024-11-19 17:51:29.738509] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:36.963 [2024-11-19 17:51:29.738511] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.344 17:51:30 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:38.344 00:07:38.344 SPDK Configuration: 00:07:38.344 Core mask: 0xf 00:07:38.344 00:07:38.344 Accel Perf Configuration: 00:07:38.344 Workload Type: decompress 00:07:38.344 Transfer size: 4096 bytes 00:07:38.344 Vector count 1 00:07:38.344 Module: software 00:07:38.344 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:38.344 Queue depth: 32 00:07:38.344 Allocate depth: 32 00:07:38.344 # threads/core: 1 00:07:38.344 Run time: 1 seconds 00:07:38.344 Verify: Yes 00:07:38.344 00:07:38.344 Running for 1 seconds... 00:07:38.344 00:07:38.344 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:38.344 ------------------------------------------------------------------------------------ 00:07:38.344 0,0 78592/s 144 MiB/s 0 0 00:07:38.344 3,0 78976/s 145 MiB/s 0 0 00:07:38.344 2,0 78560/s 144 MiB/s 0 0 00:07:38.344 1,0 78784/s 145 MiB/s 0 0 00:07:38.344 ==================================================================================== 00:07:38.344 Total 314912/s 1230 MiB/s 0 0' 00:07:38.344 17:51:30 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:30 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:38.344 17:51:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:38.344 17:51:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:38.344 17:51:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:38.344 17:51:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.344 17:51:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.344 17:51:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:38.344 17:51:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:38.344 17:51:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:38.344 17:51:30 -- accel/accel.sh@42 -- # jq -r . 00:07:38.344 [2024-11-19 17:51:30.929638] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
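[Editor's note] The table above shows all four reactors (core mask 0xf) decompressing 4096-byte buffers at roughly the same per-core rate. The Total row is just aggregate transfers/s times the transfer size; a throwaway cross-check (not part of the test itself) confirms the bandwidth figure:

    # 314912 transfers/s at 4096 bytes each, expressed in MiB/s.
    transfers=314912
    xfer_size=4096
    echo "$(( transfers * xfer_size / 1024 / 1024 )) MiB/s"   # prints "1230 MiB/s"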
00:07:38.344 [2024-11-19 17:51:30.929729] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630985 ] 00:07:38.344 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.344 [2024-11-19 17:51:30.997544] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:38.344 [2024-11-19 17:51:31.034163] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.344 [2024-11-19 17:51:31.034260] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:38.344 [2024-11-19 17:51:31.034347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:38.344 [2024-11-19 17:51:31.034349] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val=0xf 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val=decompress 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val=software 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@23 -- # accel_module=software 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:38.344 17:51:31 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.344 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.344 17:51:31 -- accel/accel.sh@21 -- # val=32 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.345 17:51:31 -- accel/accel.sh@21 -- # val=32 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.345 17:51:31 -- accel/accel.sh@21 -- # val=1 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.345 17:51:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.345 17:51:31 -- accel/accel.sh@21 -- # val=Yes 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.345 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:38.345 17:51:31 -- accel/accel.sh@21 -- # val= 00:07:38.345 17:51:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # IFS=: 00:07:38.345 17:51:31 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 
17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@21 -- # val= 00:07:39.726 17:51:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # IFS=: 00:07:39.726 17:51:32 -- accel/accel.sh@20 -- # read -r var val 00:07:39.726 17:51:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:39.726 17:51:32 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:39.726 17:51:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:39.726 00:07:39.726 real 0m2.603s 00:07:39.726 user 0m8.987s 00:07:39.726 sys 0m0.274s 00:07:39.726 17:51:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:39.726 17:51:32 -- common/autotest_common.sh@10 -- # set +x 00:07:39.726 ************************************ 00:07:39.726 END TEST accel_decomp_mcore 00:07:39.726 ************************************ 00:07:39.726 17:51:32 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.726 17:51:32 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:39.726 17:51:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:39.726 17:51:32 -- common/autotest_common.sh@10 -- # set +x 00:07:39.726 ************************************ 00:07:39.726 START TEST accel_decomp_full_mcore 00:07:39.726 ************************************ 00:07:39.726 17:51:32 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.726 17:51:32 -- accel/accel.sh@16 -- # local accel_opc 00:07:39.726 17:51:32 -- accel/accel.sh@17 -- # local accel_module 00:07:39.726 17:51:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.726 17:51:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.726 17:51:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:39.726 17:51:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:39.726 17:51:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.726 17:51:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.726 17:51:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:39.726 17:51:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:39.726 17:51:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:39.726 17:51:32 -- accel/accel.sh@42 -- # jq -r . 00:07:39.726 [2024-11-19 17:51:32.283771] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:39.726 [2024-11-19 17:51:32.283867] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631272 ] 00:07:39.726 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.726 [2024-11-19 17:51:32.352413] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:39.726 [2024-11-19 17:51:32.389723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:39.726 [2024-11-19 17:51:32.389819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:39.726 [2024-11-19 17:51:32.389905] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:39.726 [2024-11-19 17:51:32.389922] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.106 17:51:33 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:41.106 00:07:41.106 SPDK Configuration: 00:07:41.106 Core mask: 0xf 00:07:41.106 00:07:41.106 Accel Perf Configuration: 00:07:41.106 Workload Type: decompress 00:07:41.106 Transfer size: 111250 bytes 00:07:41.106 Vector count 1 00:07:41.106 Module: software 00:07:41.106 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:41.106 Queue depth: 32 00:07:41.106 Allocate depth: 32 00:07:41.106 # threads/core: 1 00:07:41.106 Run time: 1 seconds 00:07:41.106 Verify: Yes 00:07:41.106 00:07:41.106 Running for 1 seconds... 00:07:41.106 00:07:41.106 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:41.106 ------------------------------------------------------------------------------------ 00:07:41.106 0,0 5792/s 239 MiB/s 0 0 00:07:41.106 3,0 5824/s 240 MiB/s 0 0 00:07:41.107 2,0 5824/s 240 MiB/s 0 0 00:07:41.107 1,0 5824/s 240 MiB/s 0 0 00:07:41.107 ==================================================================================== 00:07:41.107 Total 23264/s 2468 MiB/s 0 0' 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:41.107 17:51:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:41.107 17:51:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.107 17:51:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:41.107 17:51:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.107 17:51:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.107 17:51:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:41.107 17:51:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:41.107 17:51:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:41.107 17:51:33 -- accel/accel.sh@42 -- # jq -r . 00:07:41.107 [2024-11-19 17:51:33.587952] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
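[Editor's note] The starred START/END banners and the real/user/sys triplets around each case in this section come from the run_test wrapper in autotest_common.sh, which times the command it is handed. A rough sketch of its shape (illustrative only; the in-tree helper also manages xtrace and timing bookkeeping):

    # Banner-and-time wrapper around a single test command.
    run_test() {
        local test_name=$1; shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"
        local rc=$?
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
        return $rc
    }

User time exceeding wall time in the multicore cases (for example user 0m8.987s against real 0m2.603s above) is expected: four reactors are polling in parallel.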
00:07:41.107 [2024-11-19 17:51:33.588044] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631544 ] 00:07:41.107 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.107 [2024-11-19 17:51:33.655058] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:41.107 [2024-11-19 17:51:33.691113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:41.107 [2024-11-19 17:51:33.691211] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:41.107 [2024-11-19 17:51:33.691301] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:41.107 [2024-11-19 17:51:33.691303] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=0xf 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=decompress 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=software 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@23 -- # accel_module=software 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=32 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=32 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=1 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val=Yes 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:41.107 17:51:33 -- accel/accel.sh@21 -- # val= 00:07:41.107 17:51:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # IFS=: 00:07:41.107 17:51:33 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 
17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@21 -- # val= 00:07:42.045 17:51:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # IFS=: 00:07:42.045 17:51:34 -- accel/accel.sh@20 -- # read -r var val 00:07:42.045 17:51:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:42.045 17:51:34 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:42.045 17:51:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:42.045 00:07:42.045 real 0m2.615s 00:07:42.045 user 0m9.041s 00:07:42.045 sys 0m0.270s 00:07:42.045 17:51:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:42.045 17:51:34 -- common/autotest_common.sh@10 -- # set +x 00:07:42.045 ************************************ 00:07:42.045 END TEST accel_decomp_full_mcore 00:07:42.045 ************************************ 00:07:42.305 17:51:34 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:42.305 17:51:34 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:42.305 17:51:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:42.305 17:51:34 -- common/autotest_common.sh@10 -- # set +x 00:07:42.305 ************************************ 00:07:42.305 START TEST accel_decomp_mthread 00:07:42.305 ************************************ 00:07:42.305 17:51:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:42.305 17:51:34 -- accel/accel.sh@16 -- # local accel_opc 00:07:42.305 17:51:34 -- accel/accel.sh@17 -- # local accel_module 00:07:42.305 17:51:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:42.305 17:51:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:42.305 17:51:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:42.305 17:51:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:42.305 17:51:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.305 17:51:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.305 17:51:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:42.305 17:51:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:42.305 17:51:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:42.305 17:51:34 -- accel/accel.sh@42 -- # jq -r . 00:07:42.305 [2024-11-19 17:51:34.946930] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:42.305 [2024-11-19 17:51:34.947018] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631828 ] 00:07:42.305 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.305 [2024-11-19 17:51:35.015482] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.305 [2024-11-19 17:51:35.049186] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.685 17:51:36 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:43.685 00:07:43.685 SPDK Configuration: 00:07:43.685 Core mask: 0x1 00:07:43.685 00:07:43.685 Accel Perf Configuration: 00:07:43.685 Workload Type: decompress 00:07:43.685 Transfer size: 4096 bytes 00:07:43.685 Vector count 1 00:07:43.685 Module: software 00:07:43.685 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:43.685 Queue depth: 32 00:07:43.685 Allocate depth: 32 00:07:43.685 # threads/core: 2 00:07:43.685 Run time: 1 seconds 00:07:43.685 Verify: Yes 00:07:43.685 00:07:43.685 Running for 1 seconds... 00:07:43.685 00:07:43.685 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:43.685 ------------------------------------------------------------------------------------ 00:07:43.685 0,1 47520/s 87 MiB/s 0 0 00:07:43.685 0,0 47424/s 87 MiB/s 0 0 00:07:43.685 ==================================================================================== 00:07:43.685 Total 94944/s 370 MiB/s 0 0' 00:07:43.685 17:51:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:43.685 17:51:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:43.685 17:51:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.685 17:51:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.685 17:51:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:43.685 17:51:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:43.685 17:51:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:43.685 17:51:36 -- accel/accel.sh@42 -- # jq -r . 00:07:43.685 [2024-11-19 17:51:36.222792] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
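[Editor's note] With -T 2 the result rows above are keyed Core,Thread, so 0,0 and 0,1 are two worker threads on the same core, and the Total row is their sum. Against a saved copy of the table (results.txt here is hypothetical), that can be checked mechanically:

    # Strip the "/s" suffix from column 2 of the per-thread rows and sum it:
    # 47520 + 47424 = 94944, matching the Total row.
    awk '/^[0-9]+,[0-9]+ / { gsub("/s", "", $2); sum += $2 } END { print sum "/s" }' results.txt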
00:07:43.685 [2024-11-19 17:51:36.222843] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631988 ] 00:07:43.685 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.685 [2024-11-19 17:51:36.281522] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.685 [2024-11-19 17:51:36.316200] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=0x1 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=decompress 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=software 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@23 -- # accel_module=software 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=32 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 
-- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=32 00:07:43.685 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.685 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.685 17:51:36 -- accel/accel.sh@21 -- # val=2 00:07:43.686 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.686 17:51:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:43.686 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.686 17:51:36 -- accel/accel.sh@21 -- # val=Yes 00:07:43.686 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.686 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.686 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:43.686 17:51:36 -- accel/accel.sh@21 -- # val= 00:07:43.686 17:51:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # IFS=: 00:07:43.686 17:51:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@21 -- # val= 00:07:44.625 17:51:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # IFS=: 00:07:44.625 17:51:37 -- accel/accel.sh@20 -- # read -r var val 00:07:44.625 17:51:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:44.625 17:51:37 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:44.625 17:51:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.625 00:07:44.625 real 0m2.562s 00:07:44.625 user 0m2.326s 00:07:44.625 sys 0m0.245s 00:07:44.625 17:51:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:44.625 17:51:37 -- common/autotest_common.sh@10 -- # set +x 
00:07:44.625 ************************************ 00:07:44.625 END TEST accel_decomp_mthread 00:07:44.625 ************************************ 00:07:44.885 17:51:37 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:44.885 17:51:37 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:44.885 17:51:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:44.885 17:51:37 -- common/autotest_common.sh@10 -- # set +x 00:07:44.885 ************************************ 00:07:44.885 START TEST accel_deomp_full_mthread 00:07:44.885 ************************************ 00:07:44.885 17:51:37 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:44.885 17:51:37 -- accel/accel.sh@16 -- # local accel_opc 00:07:44.885 17:51:37 -- accel/accel.sh@17 -- # local accel_module 00:07:44.885 17:51:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:44.885 17:51:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:44.885 17:51:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:44.885 17:51:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:44.885 17:51:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.885 17:51:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.885 17:51:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:44.885 17:51:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:44.885 17:51:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:44.885 17:51:37 -- accel/accel.sh@42 -- # jq -r . 00:07:44.885 [2024-11-19 17:51:37.558843] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:44.885 [2024-11-19 17:51:37.558931] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid632162 ] 00:07:44.885 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.885 [2024-11-19 17:51:37.626905] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.885 [2024-11-19 17:51:37.662386] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.265 17:51:38 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:46.265 00:07:46.265 SPDK Configuration: 00:07:46.265 Core mask: 0x1 00:07:46.265 00:07:46.265 Accel Perf Configuration: 00:07:46.265 Workload Type: decompress 00:07:46.265 Transfer size: 111250 bytes 00:07:46.265 Vector count 1 00:07:46.265 Module: software 00:07:46.265 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:46.265 Queue depth: 32 00:07:46.265 Allocate depth: 32 00:07:46.265 # threads/core: 2 00:07:46.265 Run time: 1 seconds 00:07:46.265 Verify: Yes 00:07:46.265 00:07:46.265 Running for 1 seconds... 
00:07:46.265 00:07:46.265 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:46.265 ------------------------------------------------------------------------------------ 00:07:46.265 0,1 3040/s 125 MiB/s 0 0 00:07:46.265 0,0 2976/s 122 MiB/s 0 0 00:07:46.265 ==================================================================================== 00:07:46.265 Total 6016/s 638 MiB/s 0 0' 00:07:46.265 17:51:38 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:38 -- accel/accel.sh@20 -- # read -r var val 00:07:46.265 17:51:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:46.265 17:51:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:46.265 17:51:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:46.265 17:51:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:46.265 17:51:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.265 17:51:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.265 17:51:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:46.265 17:51:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:46.265 17:51:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:46.265 17:51:38 -- accel/accel.sh@42 -- # jq -r . 00:07:46.265 [2024-11-19 17:51:38.866304] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:46.265 [2024-11-19 17:51:38.866394] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid632409 ] 00:07:46.265 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.265 [2024-11-19 17:51:38.933474] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.265 [2024-11-19 17:51:38.966698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.265 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.265 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.265 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.265 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.265 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.265 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.265 17:51:39 -- accel/accel.sh@21 -- # val=0x1 00:07:46.265 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.265 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.265 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.265 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.265 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.265 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=decompress 
00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=software 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@23 -- # accel_module=software 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=32 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=32 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=2 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val=Yes 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.266 17:51:39 -- accel/accel.sh@21 -- # val= 00:07:46.266 17:51:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.266 17:51:39 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@21 -- # val= 00:07:47.643 17:51:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # IFS=: 00:07:47.643 17:51:40 -- accel/accel.sh@20 -- # read -r var val 00:07:47.643 17:51:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:47.643 17:51:40 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:47.643 17:51:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.643 00:07:47.643 real 0m2.623s 00:07:47.643 user 0m2.385s 00:07:47.643 sys 0m0.245s 00:07:47.643 17:51:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:47.643 17:51:40 -- common/autotest_common.sh@10 -- # set +x 00:07:47.643 ************************************ 00:07:47.643 END TEST accel_deomp_full_mthread 00:07:47.643 ************************************ 00:07:47.643 17:51:40 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:47.643 17:51:40 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:47.643 17:51:40 -- accel/accel.sh@129 -- # build_accel_config 00:07:47.643 17:51:40 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:47.643 17:51:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:47.643 17:51:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:47.643 17:51:40 -- common/autotest_common.sh@10 -- # set +x 00:07:47.643 17:51:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.643 17:51:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.643 17:51:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:47.643 17:51:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:47.643 17:51:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:47.643 17:51:40 -- accel/accel.sh@42 -- # jq -r . 00:07:47.643 ************************************ 00:07:47.643 START TEST accel_dif_functional_tests 00:07:47.643 ************************************ 00:07:47.643 17:51:40 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:47.643 [2024-11-19 17:51:40.233145] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:47.643 [2024-11-19 17:51:40.233234] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid632697 ] 00:07:47.643 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.643 [2024-11-19 17:51:40.299667] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:47.643 [2024-11-19 17:51:40.335822] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:47.643 [2024-11-19 17:51:40.335917] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.644 [2024-11-19 17:51:40.335917] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:47.644 00:07:47.644 00:07:47.644 CUnit - A unit testing framework for C - Version 2.1-3 00:07:47.644 http://cunit.sourceforge.net/ 00:07:47.644 00:07:47.644 00:07:47.644 Suite: accel_dif 00:07:47.644 Test: verify: DIF generated, GUARD check ...passed 00:07:47.644 Test: verify: DIF generated, APPTAG check ...passed 00:07:47.644 Test: verify: DIF generated, REFTAG check ...passed 00:07:47.644 Test: verify: DIF not generated, GUARD check ...[2024-11-19 17:51:40.398916] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:47.644 [2024-11-19 17:51:40.398971] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:47.644 passed 00:07:47.644 Test: verify: DIF not generated, APPTAG check ...[2024-11-19 17:51:40.399022] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:47.644 [2024-11-19 17:51:40.399042] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:47.644 passed 00:07:47.644 Test: verify: DIF not generated, REFTAG check ...[2024-11-19 17:51:40.399062] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:47.644 [2024-11-19 17:51:40.399081] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:47.644 passed 00:07:47.644 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:47.644 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-19 17:51:40.399127] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:47.644 passed 00:07:47.644 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:47.644 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:47.644 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:47.644 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-19 17:51:40.399233] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:47.644 passed 00:07:47.644 Test: generate copy: DIF generated, GUARD check ...passed 00:07:47.644 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:47.644 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:47.644 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:47.644 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:47.644 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:47.644 Test: generate copy: iovecs-len validate ...[2024-11-19 17:51:40.399414] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:47.644 passed 00:07:47.644 Test: generate copy: buffer alignment validate ...passed 00:07:47.644 00:07:47.644 Run Summary: Type Total Ran Passed Failed Inactive 00:07:47.644 suites 1 1 n/a 0 0 00:07:47.644 tests 20 20 20 0 0 00:07:47.644 asserts 204 204 204 0 n/a 00:07:47.644 00:07:47.644 Elapsed time = 0.000 seconds 00:07:47.903 00:07:47.903 real 0m0.339s 00:07:47.903 user 0m0.521s 00:07:47.903 sys 0m0.157s 00:07:47.903 17:51:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:47.903 17:51:40 -- common/autotest_common.sh@10 -- # set +x 00:07:47.903 ************************************ 00:07:47.903 END TEST accel_dif_functional_tests 00:07:47.903 ************************************ 00:07:47.903 00:07:47.903 real 0m55.024s 00:07:47.903 user 1m2.636s 00:07:47.903 sys 0m6.915s 00:07:47.903 17:51:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:47.903 17:51:40 -- common/autotest_common.sh@10 -- # set +x 00:07:47.903 ************************************ 00:07:47.903 END TEST accel 00:07:47.903 ************************************ 00:07:47.903 17:51:40 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:47.904 17:51:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:47.904 17:51:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:47.904 17:51:40 -- common/autotest_common.sh@10 -- # set +x 00:07:47.904 ************************************ 00:07:47.904 START TEST accel_rpc 00:07:47.904 ************************************ 00:07:47.904 17:51:40 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:47.904 * Looking for test storage... 00:07:47.904 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:47.904 17:51:40 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:47.904 17:51:40 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:47.904 17:51:40 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:48.163 17:51:40 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:48.163 17:51:40 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:48.163 17:51:40 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:48.163 17:51:40 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:48.163 17:51:40 -- scripts/common.sh@335 -- # IFS=.-: 00:07:48.163 17:51:40 -- scripts/common.sh@335 -- # read -ra ver1 00:07:48.163 17:51:40 -- scripts/common.sh@336 -- # IFS=.-: 00:07:48.163 17:51:40 -- scripts/common.sh@336 -- # read -ra ver2 00:07:48.163 17:51:40 -- scripts/common.sh@337 -- # local 'op=<' 00:07:48.163 17:51:40 -- scripts/common.sh@339 -- # ver1_l=2 00:07:48.163 17:51:40 -- scripts/common.sh@340 -- # ver2_l=1 00:07:48.163 17:51:40 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:48.163 17:51:40 -- scripts/common.sh@343 -- # case "$op" in 00:07:48.163 17:51:40 -- scripts/common.sh@344 -- # : 1 00:07:48.163 17:51:40 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:48.163 17:51:40 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:48.163 17:51:40 -- scripts/common.sh@364 -- # decimal 1 00:07:48.163 17:51:40 -- scripts/common.sh@352 -- # local d=1 00:07:48.163 17:51:40 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:48.163 17:51:40 -- scripts/common.sh@354 -- # echo 1 00:07:48.163 17:51:40 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:48.163 17:51:40 -- scripts/common.sh@365 -- # decimal 2 00:07:48.163 17:51:40 -- scripts/common.sh@352 -- # local d=2 00:07:48.163 17:51:40 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:48.163 17:51:40 -- scripts/common.sh@354 -- # echo 2 00:07:48.163 17:51:40 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:48.163 17:51:40 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:48.163 17:51:40 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:48.163 17:51:40 -- scripts/common.sh@367 -- # return 0 00:07:48.163 17:51:40 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:48.163 17:51:40 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:48.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.163 --rc genhtml_branch_coverage=1 00:07:48.163 --rc genhtml_function_coverage=1 00:07:48.163 --rc genhtml_legend=1 00:07:48.163 --rc geninfo_all_blocks=1 00:07:48.163 --rc geninfo_unexecuted_blocks=1 00:07:48.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.163 ' 00:07:48.163 17:51:40 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:48.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.163 --rc genhtml_branch_coverage=1 00:07:48.163 --rc genhtml_function_coverage=1 00:07:48.163 --rc genhtml_legend=1 00:07:48.163 --rc geninfo_all_blocks=1 00:07:48.163 --rc geninfo_unexecuted_blocks=1 00:07:48.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.163 ' 00:07:48.164 17:51:40 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:48.164 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.164 --rc genhtml_branch_coverage=1 00:07:48.164 --rc genhtml_function_coverage=1 00:07:48.164 --rc genhtml_legend=1 00:07:48.164 --rc geninfo_all_blocks=1 00:07:48.164 --rc geninfo_unexecuted_blocks=1 00:07:48.164 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.164 ' 00:07:48.164 17:51:40 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:48.164 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.164 --rc genhtml_branch_coverage=1 00:07:48.164 --rc genhtml_function_coverage=1 00:07:48.164 --rc genhtml_legend=1 00:07:48.164 --rc geninfo_all_blocks=1 00:07:48.164 --rc geninfo_unexecuted_blocks=1 00:07:48.164 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.164 ' 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=632892 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@15 -- # waitforlisten 632892 00:07:48.164 17:51:40 -- common/autotest_common.sh@829 -- # '[' -z 632892 ']' 00:07:48.164 17:51:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:48.164 17:51:40 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:48.164 
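The xtrace above walks through cmp_versions from scripts/common.sh: both version strings are split on dots and compared field by field, with missing fields treated as zero, which is how "lt 1.15 2" decides that the installed lcov is a legacy 1.x release. A short C sketch of the same idea, assuming nothing beyond the traced behavior:

```c
#include <stdio.h>
#include <stdlib.h>

/* Advance past one numeric field (and its trailing dot, if any);
 * an exhausted string yields 0, matching cmp_versions' zero-padding. */
static long next_field(const char **s)
{
    char *end;
    long v = strtol(*s, &end, 10);

    *s = (*end == '.') ? end + 1 : end;
    return v;
}

/* Field-by-field dotted-version compare; returns <0, 0, >0 like strcmp. */
static int cmp_versions(const char *a, const char *b)
{
    while (*a != '\0' || *b != '\0') {
        long x = next_field(&a);
        long y = next_field(&b);

        if (x != y) {
            return x < y ? -1 : 1;
        }
    }
    return 0;
}

int main(void)
{
    printf("%d\n", cmp_versions("1.15", "2"));      /* -1: lcov 1.15 < 2 */
    printf("%d\n", cmp_versions("2.0", "2"));       /*  0: equal        */
    printf("%d\n", cmp_versions("24.1.1", "24.1")); /*  1: longer wins  */
    return 0;
}
```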
17:51:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:48.164 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:48.164 17:51:40 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:48.164 17:51:40 -- common/autotest_common.sh@10 -- # set +x 00:07:48.164 [2024-11-19 17:51:40.842382] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:48.164 [2024-11-19 17:51:40.842470] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid632892 ] 00:07:48.164 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.164 [2024-11-19 17:51:40.909016] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.164 [2024-11-19 17:51:40.946477] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.164 [2024-11-19 17:51:40.946588] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.164 17:51:40 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:48.164 17:51:40 -- common/autotest_common.sh@862 -- # return 0 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:48.164 17:51:40 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:48.164 17:51:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:48.164 17:51:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:48.164 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.164 ************************************ 00:07:48.164 START TEST accel_assign_opcode 00:07:48.164 ************************************ 00:07:48.164 17:51:41 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:48.164 17:51:41 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:48.164 17:51:41 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.164 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.164 [2024-11-19 17:51:41.015080] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:48.164 17:51:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:48.164 17:51:41 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:48.164 17:51:41 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.164 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.164 [2024-11-19 17:51:41.023100] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:48.424 17:51:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:48.424 17:51:41 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:48.424 17:51:41 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.424 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.424 17:51:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:48.424 17:51:41 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:48.424 17:51:41 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.424 17:51:41 -- 
accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:48.424 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.424 17:51:41 -- accel/accel_rpc.sh@42 -- # grep software 00:07:48.424 17:51:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:48.424 software 00:07:48.424 00:07:48.424 real 0m0.222s 00:07:48.424 user 0m0.047s 00:07:48.424 sys 0m0.012s 00:07:48.424 17:51:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:48.424 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.424 ************************************ 00:07:48.424 END TEST accel_assign_opcode 00:07:48.424 ************************************ 00:07:48.424 17:51:41 -- accel/accel_rpc.sh@55 -- # killprocess 632892 00:07:48.424 17:51:41 -- common/autotest_common.sh@936 -- # '[' -z 632892 ']' 00:07:48.424 17:51:41 -- common/autotest_common.sh@940 -- # kill -0 632892 00:07:48.424 17:51:41 -- common/autotest_common.sh@941 -- # uname 00:07:48.424 17:51:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:48.424 17:51:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 632892 00:07:48.683 17:51:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:48.683 17:51:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:48.683 17:51:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 632892' 00:07:48.683 killing process with pid 632892 00:07:48.683 17:51:41 -- common/autotest_common.sh@955 -- # kill 632892 00:07:48.683 17:51:41 -- common/autotest_common.sh@960 -- # wait 632892 00:07:48.943 00:07:48.943 real 0m0.979s 00:07:48.943 user 0m0.891s 00:07:48.943 sys 0m0.445s 00:07:48.943 17:51:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:48.943 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.943 ************************************ 00:07:48.943 END TEST accel_rpc 00:07:48.943 ************************************ 00:07:48.943 17:51:41 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:48.943 17:51:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:48.943 17:51:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:48.943 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:48.943 ************************************ 00:07:48.943 START TEST app_cmdline 00:07:48.943 ************************************ 00:07:48.943 17:51:41 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:48.943 * Looking for test storage... 
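The accel_assign_opcode test that finishes above first assigns the copy opcode to a nonexistent module ("incorrect"), then to "software", starts the framework, and checks via accel_get_opc_assignments (piped through jq -r .copy) that copy ended up on the software module. The sketch below is a toy model of that table, with hypothetical names (the real logic is in SPDK's lib/accel), assuming the fallback-to-software behavior the test asserts:

```c
#include <stdio.h>
#include <string.h>

enum opcode { OPC_COPY, OPC_FILL, OPC_MAX };

static const char *assigned[OPC_MAX];               /* module per opcode */
static const char *known_modules[] = { "software", "dsa", NULL };

static int module_exists(const char *name)
{
    for (int i = 0; known_modules[i] != NULL; i++) {
        if (strcmp(known_modules[i], name) == 0) {
            return 1;
        }
    }
    return 0;
}

/* The RPC only records the request, as the NOTICE lines above show;
 * validation is deferred until the framework starts. */
static void assign_opc(enum opcode opc, const char *module)
{
    printf("Operation %s will be assigned to module %s\n",
           opc == OPC_COPY ? "copy" : "fill", module);
    assigned[opc] = module;
}

/* At init time, requests naming a missing module fall back to software,
 * which is what the test's final grep verifies. */
static void framework_start_init(void)
{
    for (int opc = 0; opc < OPC_MAX; opc++) {
        if (assigned[opc] == NULL || !module_exists(assigned[opc])) {
            assigned[opc] = "software";
        }
    }
}

int main(void)
{
    assign_opc(OPC_COPY, "incorrect"); /* accepted, like the first RPC */
    assign_opc(OPC_COPY, "software");
    framework_start_init();
    printf("copy -> %s\n", assigned[OPC_COPY]); /* "software" */
    return 0;
}
```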
00:07:48.943 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:48.943 17:51:41 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:48.943 17:51:41 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:48.943 17:51:41 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:49.202 17:51:41 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:49.202 17:51:41 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:49.202 17:51:41 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:49.202 17:51:41 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:49.202 17:51:41 -- scripts/common.sh@335 -- # IFS=.-: 00:07:49.202 17:51:41 -- scripts/common.sh@335 -- # read -ra ver1 00:07:49.202 17:51:41 -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.202 17:51:41 -- scripts/common.sh@336 -- # read -ra ver2 00:07:49.202 17:51:41 -- scripts/common.sh@337 -- # local 'op=<' 00:07:49.202 17:51:41 -- scripts/common.sh@339 -- # ver1_l=2 00:07:49.202 17:51:41 -- scripts/common.sh@340 -- # ver2_l=1 00:07:49.202 17:51:41 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:49.202 17:51:41 -- scripts/common.sh@343 -- # case "$op" in 00:07:49.202 17:51:41 -- scripts/common.sh@344 -- # : 1 00:07:49.202 17:51:41 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:49.202 17:51:41 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:49.202 17:51:41 -- scripts/common.sh@364 -- # decimal 1 00:07:49.202 17:51:41 -- scripts/common.sh@352 -- # local d=1 00:07:49.202 17:51:41 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.202 17:51:41 -- scripts/common.sh@354 -- # echo 1 00:07:49.202 17:51:41 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:49.202 17:51:41 -- scripts/common.sh@365 -- # decimal 2 00:07:49.202 17:51:41 -- scripts/common.sh@352 -- # local d=2 00:07:49.202 17:51:41 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.202 17:51:41 -- scripts/common.sh@354 -- # echo 2 00:07:49.202 17:51:41 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:49.202 17:51:41 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:49.202 17:51:41 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:49.202 17:51:41 -- scripts/common.sh@367 -- # return 0 00:07:49.202 17:51:41 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.202 17:51:41 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:49.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.202 --rc genhtml_branch_coverage=1 00:07:49.202 --rc genhtml_function_coverage=1 00:07:49.202 --rc genhtml_legend=1 00:07:49.202 --rc geninfo_all_blocks=1 00:07:49.202 --rc geninfo_unexecuted_blocks=1 00:07:49.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.202 ' 00:07:49.202 17:51:41 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:49.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.202 --rc genhtml_branch_coverage=1 00:07:49.202 --rc genhtml_function_coverage=1 00:07:49.202 --rc genhtml_legend=1 00:07:49.202 --rc geninfo_all_blocks=1 00:07:49.202 --rc geninfo_unexecuted_blocks=1 00:07:49.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.202 ' 00:07:49.202 17:51:41 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:49.203 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.203 --rc genhtml_branch_coverage=1 00:07:49.203 
--rc genhtml_function_coverage=1 00:07:49.203 --rc genhtml_legend=1 00:07:49.203 --rc geninfo_all_blocks=1 00:07:49.203 --rc geninfo_unexecuted_blocks=1 00:07:49.203 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.203 ' 00:07:49.203 17:51:41 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:49.203 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.203 --rc genhtml_branch_coverage=1 00:07:49.203 --rc genhtml_function_coverage=1 00:07:49.203 --rc genhtml_legend=1 00:07:49.203 --rc geninfo_all_blocks=1 00:07:49.203 --rc geninfo_unexecuted_blocks=1 00:07:49.203 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.203 ' 00:07:49.203 17:51:41 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:49.203 17:51:41 -- app/cmdline.sh@17 -- # spdk_tgt_pid=633106 00:07:49.203 17:51:41 -- app/cmdline.sh@18 -- # waitforlisten 633106 00:07:49.203 17:51:41 -- common/autotest_common.sh@829 -- # '[' -z 633106 ']' 00:07:49.203 17:51:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:49.203 17:51:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:49.203 17:51:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:49.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:49.203 17:51:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:49.203 17:51:41 -- common/autotest_common.sh@10 -- # set +x 00:07:49.203 17:51:41 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:49.203 [2024-11-19 17:51:41.864900] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:49.203 [2024-11-19 17:51:41.864967] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid633106 ] 00:07:49.203 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.203 [2024-11-19 17:51:41.930763] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.203 [2024-11-19 17:51:41.967859] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:49.203 [2024-11-19 17:51:41.967967] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.140 17:51:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:50.140 17:51:42 -- common/autotest_common.sh@862 -- # return 0 00:07:50.140 17:51:42 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:50.140 { 00:07:50.140 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:50.140 "fields": { 00:07:50.140 "major": 24, 00:07:50.140 "minor": 1, 00:07:50.140 "patch": 1, 00:07:50.140 "suffix": "-pre", 00:07:50.140 "commit": "c13c99a5e" 00:07:50.140 } 00:07:50.140 } 00:07:50.140 17:51:42 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:50.140 17:51:42 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:50.140 17:51:42 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:50.140 17:51:42 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:50.140 17:51:42 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:50.140 17:51:42 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:50.140 17:51:42 -- app/cmdline.sh@26 -- # sort 00:07:50.140 17:51:42 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:50.140 17:51:42 -- common/autotest_common.sh@10 -- # set +x 00:07:50.140 17:51:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:50.140 17:51:42 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:50.140 17:51:42 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:50.140 17:51:42 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:50.140 17:51:42 -- common/autotest_common.sh@650 -- # local es=0 00:07:50.140 17:51:42 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:50.140 17:51:42 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:50.140 17:51:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:50.140 17:51:42 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:50.140 17:51:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:50.140 17:51:42 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:50.140 17:51:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:50.140 17:51:42 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:50.140 17:51:42 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:50.140 17:51:42 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:50.399 request: 00:07:50.399 { 00:07:50.399 "method": "env_dpdk_get_mem_stats", 00:07:50.399 "req_id": 1 00:07:50.399 } 00:07:50.399 Got JSON-RPC error response 00:07:50.399 response: 00:07:50.399 { 00:07:50.399 "code": -32601, 00:07:50.399 "message": "Method not found" 00:07:50.399 } 00:07:50.399 17:51:43 -- common/autotest_common.sh@653 -- # es=1 00:07:50.399 17:51:43 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:50.399 17:51:43 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:50.399 17:51:43 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:50.400 17:51:43 -- app/cmdline.sh@1 -- # killprocess 633106 00:07:50.400 17:51:43 -- common/autotest_common.sh@936 -- # '[' -z 633106 ']' 00:07:50.400 17:51:43 -- common/autotest_common.sh@940 -- # kill -0 633106 00:07:50.400 17:51:43 -- common/autotest_common.sh@941 -- # uname 00:07:50.400 17:51:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:50.400 17:51:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 633106 00:07:50.400 17:51:43 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:50.400 17:51:43 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:50.400 17:51:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 633106' 00:07:50.400 killing process with pid 633106 00:07:50.400 17:51:43 -- common/autotest_common.sh@955 -- # kill 633106 00:07:50.400 17:51:43 -- common/autotest_common.sh@960 -- # wait 633106 00:07:50.659 00:07:50.660 real 0m1.727s 00:07:50.660 user 0m1.980s 00:07:50.660 sys 0m0.500s 00:07:50.660 17:51:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:50.660 17:51:43 -- common/autotest_common.sh@10 -- # set +x 00:07:50.660 ************************************ 00:07:50.660 END TEST app_cmdline 00:07:50.660 ************************************ 00:07:50.660 17:51:43 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:50.660 17:51:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:50.660 17:51:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:50.660 17:51:43 -- common/autotest_common.sh@10 -- # set +x 00:07:50.660 ************************************ 00:07:50.660 START TEST version 00:07:50.660 ************************************ 00:07:50.660 17:51:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:50.920 * Looking for test storage... 
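The app_cmdline run above exercises spdk_tgt's --rpcs-allowed filter: spdk_get_version and rpc_get_methods are the only two methods served, and the deliberate env_dpdk_get_mem_stats call is rejected with error code -32601, which JSON-RPC 2.0 reserves for "Method not found" and which matches the response captured in the log. A minimal sketch of both halves, allowlist check plus error reply (illustrative only, not SPDK's actual rpc.c):

```c
#include <stdio.h>
#include <string.h>

/* Model of --rpcs-allowed: only listed methods are dispatched. */
static const char *rpcs_allowed[] = { "spdk_get_version",
                                      "rpc_get_methods", NULL };

static int rpc_is_allowed(const char *method)
{
    for (int i = 0; rpcs_allowed[i] != NULL; i++) {
        if (strcmp(rpcs_allowed[i], method) == 0) {
            return 1;
        }
    }
    return 0;
}

static void handle_request(const char *method, int req_id)
{
    if (!rpc_is_allowed(method)) {
        /* -32601 is the JSON-RPC 2.0 reserved "Method not found" code. */
        printf("{\"jsonrpc\":\"2.0\",\"id\":%d,\"error\":"
               "{\"code\":-32601,\"message\":\"Method not found\"}}\n",
               req_id);
        return;
    }
    printf("dispatching %s\n", method); /* real handler would run here */
}

int main(void)
{
    handle_request("spdk_get_version", 1);       /* dispatched          */
    handle_request("env_dpdk_get_mem_stats", 2); /* -32601, as logged   */
    return 0;
}
```

Filtering before dispatch is the design point the test checks: the handler for a disallowed method never runs, even though it is registered in the target.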
00:07:50.920 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:50.920 17:51:43 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:50.920 17:51:43 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:50.920 17:51:43 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:50.920 17:51:43 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:50.920 17:51:43 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:50.920 17:51:43 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:50.920 17:51:43 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:50.920 17:51:43 -- scripts/common.sh@335 -- # IFS=.-: 00:07:50.920 17:51:43 -- scripts/common.sh@335 -- # read -ra ver1 00:07:50.920 17:51:43 -- scripts/common.sh@336 -- # IFS=.-: 00:07:50.920 17:51:43 -- scripts/common.sh@336 -- # read -ra ver2 00:07:50.920 17:51:43 -- scripts/common.sh@337 -- # local 'op=<' 00:07:50.920 17:51:43 -- scripts/common.sh@339 -- # ver1_l=2 00:07:50.920 17:51:43 -- scripts/common.sh@340 -- # ver2_l=1 00:07:50.920 17:51:43 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:50.920 17:51:43 -- scripts/common.sh@343 -- # case "$op" in 00:07:50.920 17:51:43 -- scripts/common.sh@344 -- # : 1 00:07:50.920 17:51:43 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:50.920 17:51:43 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:50.920 17:51:43 -- scripts/common.sh@364 -- # decimal 1 00:07:50.920 17:51:43 -- scripts/common.sh@352 -- # local d=1 00:07:50.920 17:51:43 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:50.920 17:51:43 -- scripts/common.sh@354 -- # echo 1 00:07:50.920 17:51:43 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:50.920 17:51:43 -- scripts/common.sh@365 -- # decimal 2 00:07:50.920 17:51:43 -- scripts/common.sh@352 -- # local d=2 00:07:50.920 17:51:43 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:50.920 17:51:43 -- scripts/common.sh@354 -- # echo 2 00:07:50.920 17:51:43 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:50.920 17:51:43 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:50.920 17:51:43 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:50.920 17:51:43 -- scripts/common.sh@367 -- # return 0 00:07:50.920 17:51:43 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:50.920 17:51:43 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:50.920 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.920 --rc genhtml_branch_coverage=1 00:07:50.920 --rc genhtml_function_coverage=1 00:07:50.920 --rc genhtml_legend=1 00:07:50.920 --rc geninfo_all_blocks=1 00:07:50.920 --rc geninfo_unexecuted_blocks=1 00:07:50.920 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.920 ' 00:07:50.920 17:51:43 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:50.920 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.920 --rc genhtml_branch_coverage=1 00:07:50.920 --rc genhtml_function_coverage=1 00:07:50.920 --rc genhtml_legend=1 00:07:50.920 --rc geninfo_all_blocks=1 00:07:50.920 --rc geninfo_unexecuted_blocks=1 00:07:50.920 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.920 ' 00:07:50.920 17:51:43 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:50.920 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.920 --rc genhtml_branch_coverage=1 00:07:50.920 
--rc genhtml_function_coverage=1 00:07:50.920 --rc genhtml_legend=1 00:07:50.920 --rc geninfo_all_blocks=1 00:07:50.920 --rc geninfo_unexecuted_blocks=1 00:07:50.920 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.920 ' 00:07:50.920 17:51:43 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:50.920 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.920 --rc genhtml_branch_coverage=1 00:07:50.920 --rc genhtml_function_coverage=1 00:07:50.920 --rc genhtml_legend=1 00:07:50.920 --rc geninfo_all_blocks=1 00:07:50.920 --rc geninfo_unexecuted_blocks=1 00:07:50.920 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.920 ' 00:07:50.920 17:51:43 -- app/version.sh@17 -- # get_header_version major 00:07:50.920 17:51:43 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.920 17:51:43 -- app/version.sh@14 -- # cut -f2 00:07:50.920 17:51:43 -- app/version.sh@14 -- # tr -d '"' 00:07:50.920 17:51:43 -- app/version.sh@17 -- # major=24 00:07:50.920 17:51:43 -- app/version.sh@18 -- # get_header_version minor 00:07:50.920 17:51:43 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.920 17:51:43 -- app/version.sh@14 -- # cut -f2 00:07:50.920 17:51:43 -- app/version.sh@14 -- # tr -d '"' 00:07:50.920 17:51:43 -- app/version.sh@18 -- # minor=1 00:07:50.920 17:51:43 -- app/version.sh@19 -- # get_header_version patch 00:07:50.920 17:51:43 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.920 17:51:43 -- app/version.sh@14 -- # cut -f2 00:07:50.920 17:51:43 -- app/version.sh@14 -- # tr -d '"' 00:07:50.920 17:51:43 -- app/version.sh@19 -- # patch=1 00:07:50.920 17:51:43 -- app/version.sh@20 -- # get_header_version suffix 00:07:50.920 17:51:43 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.920 17:51:43 -- app/version.sh@14 -- # cut -f2 00:07:50.920 17:51:43 -- app/version.sh@14 -- # tr -d '"' 00:07:50.920 17:51:43 -- app/version.sh@20 -- # suffix=-pre 00:07:50.920 17:51:43 -- app/version.sh@22 -- # version=24.1 00:07:50.920 17:51:43 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:50.920 17:51:43 -- app/version.sh@25 -- # version=24.1.1 00:07:50.920 17:51:43 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:50.920 17:51:43 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:50.920 17:51:43 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:50.920 17:51:43 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:50.920 17:51:43 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:50.920 00:07:50.920 real 0m0.240s 00:07:50.920 user 0m0.128s 00:07:50.920 sys 0m0.159s 00:07:50.920 17:51:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:50.920 17:51:43 -- common/autotest_common.sh@10 -- # set +x 00:07:50.920 
************************************ 00:07:50.920 END TEST version 00:07:50.920 ************************************ 00:07:50.920 17:51:43 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:50.920 17:51:43 -- spdk/autotest.sh@191 -- # uname -s 00:07:50.920 17:51:43 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:50.920 17:51:43 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:50.920 17:51:43 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:50.920 17:51:43 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:50.920 17:51:43 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:50.920 17:51:43 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:50.920 17:51:43 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:50.920 17:51:43 -- common/autotest_common.sh@10 -- # set +x 00:07:50.920 17:51:43 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:50.920 17:51:43 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:50.920 17:51:43 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:50.921 17:51:43 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:50.921 17:51:43 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:50.921 17:51:43 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:50.921 17:51:43 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:50.921 17:51:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:50.921 17:51:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:50.921 17:51:43 -- common/autotest_common.sh@10 -- # set +x 00:07:50.921 ************************************ 00:07:50.921 START TEST llvm_fuzz 00:07:50.921 ************************************ 00:07:50.921 17:51:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:51.181 * Looking for test storage... 
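For reference, the version test that just completed above greps the SPDK_VERSION_MAJOR/MINOR/PATCH/SUFFIX macros out of include/spdk/version.h and recomposes "24.1.1rc0" (a nonzero patch appends ".patch", and the "-pre" suffix maps to the Python-style "rc0"), then checks that the installed spdk Python package reports the same string. A small C sketch of the same composition, with the macro values copied from the log rather than from a real header:

```c
#include <stdio.h>

#define SPDK_VERSION_MAJOR  24
#define SPDK_VERSION_MINOR  1
#define SPDK_VERSION_PATCH  1
#define SPDK_VERSION_SUFFIX "-pre"

int main(void)
{
    char ver[32];

    /* version.sh: start with major.minor, append .patch only if != 0 */
    if (SPDK_VERSION_PATCH != 0) {
        snprintf(ver, sizeof(ver), "%d.%d.%d", SPDK_VERSION_MAJOR,
                 SPDK_VERSION_MINOR, SPDK_VERSION_PATCH);
    } else {
        snprintf(ver, sizeof(ver), "%d.%d", SPDK_VERSION_MAJOR,
                 SPDK_VERSION_MINOR);
    }
    /* a "-pre" suffix becomes the Python pre-release form "rc0" */
    printf("%s%s\n", ver, SPDK_VERSION_SUFFIX[0] != '\0' ? "rc0" : "");
    return 0;                               /* prints 24.1.1rc0 */
}
```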
00:07:51.181 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:51.181 17:51:43 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:51.181 17:51:43 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:51.181 17:51:43 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:51.181 17:51:43 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:51.181 17:51:43 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:51.181 17:51:43 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:51.181 17:51:43 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:51.181 17:51:43 -- scripts/common.sh@335 -- # IFS=.-: 00:07:51.181 17:51:43 -- scripts/common.sh@335 -- # read -ra ver1 00:07:51.181 17:51:43 -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.181 17:51:43 -- scripts/common.sh@336 -- # read -ra ver2 00:07:51.181 17:51:43 -- scripts/common.sh@337 -- # local 'op=<' 00:07:51.181 17:51:43 -- scripts/common.sh@339 -- # ver1_l=2 00:07:51.181 17:51:43 -- scripts/common.sh@340 -- # ver2_l=1 00:07:51.181 17:51:43 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:51.181 17:51:43 -- scripts/common.sh@343 -- # case "$op" in 00:07:51.181 17:51:43 -- scripts/common.sh@344 -- # : 1 00:07:51.181 17:51:43 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:51.181 17:51:43 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:51.181 17:51:43 -- scripts/common.sh@364 -- # decimal 1 00:07:51.181 17:51:43 -- scripts/common.sh@352 -- # local d=1 00:07:51.181 17:51:43 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.181 17:51:43 -- scripts/common.sh@354 -- # echo 1 00:07:51.181 17:51:43 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:51.181 17:51:43 -- scripts/common.sh@365 -- # decimal 2 00:07:51.181 17:51:43 -- scripts/common.sh@352 -- # local d=2 00:07:51.181 17:51:43 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.181 17:51:43 -- scripts/common.sh@354 -- # echo 2 00:07:51.181 17:51:43 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:51.181 17:51:43 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:51.181 17:51:43 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:51.181 17:51:43 -- scripts/common.sh@367 -- # return 0 00:07:51.181 17:51:43 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.181 17:51:43 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:51.181 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.181 --rc genhtml_branch_coverage=1 00:07:51.181 --rc genhtml_function_coverage=1 00:07:51.181 --rc genhtml_legend=1 00:07:51.181 --rc geninfo_all_blocks=1 00:07:51.181 --rc geninfo_unexecuted_blocks=1 00:07:51.181 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.181 ' 00:07:51.181 17:51:43 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:51.181 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.181 --rc genhtml_branch_coverage=1 00:07:51.181 --rc genhtml_function_coverage=1 00:07:51.181 --rc genhtml_legend=1 00:07:51.181 --rc geninfo_all_blocks=1 00:07:51.181 --rc geninfo_unexecuted_blocks=1 00:07:51.181 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.181 ' 00:07:51.181 17:51:43 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:51.181 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.181 --rc genhtml_branch_coverage=1 00:07:51.181 
--rc genhtml_function_coverage=1 00:07:51.181 --rc genhtml_legend=1 00:07:51.181 --rc geninfo_all_blocks=1 00:07:51.181 --rc geninfo_unexecuted_blocks=1 00:07:51.181 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.181 ' 00:07:51.181 17:51:43 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:51.181 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.181 --rc genhtml_branch_coverage=1 00:07:51.181 --rc genhtml_function_coverage=1 00:07:51.181 --rc genhtml_legend=1 00:07:51.181 --rc geninfo_all_blocks=1 00:07:51.181 --rc geninfo_unexecuted_blocks=1 00:07:51.181 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.181 ' 00:07:51.181 17:51:43 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:51.181 17:51:43 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:51.181 17:51:43 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:51.181 17:51:43 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:51.181 17:51:43 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:51.181 17:51:43 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:51.181 17:51:43 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:51.181 17:51:43 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:51.181 17:51:43 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:51.181 17:51:43 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:51.181 17:51:43 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:51.181 17:51:43 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:51.181 17:51:43 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:51.181 17:51:43 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:51.181 17:51:43 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:51.181 17:51:43 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:51.181 17:51:43 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:51.181 17:51:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:51.181 17:51:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:51.181 17:51:43 -- common/autotest_common.sh@10 -- # set +x 00:07:51.181 ************************************ 00:07:51.181 START TEST nvmf_fuzz 00:07:51.181 ************************************ 00:07:51.181 17:51:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:51.443 * Looking for test storage... 
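The nvmf_fuzz stage starting above drives SPDK's NVMe-oF target with clang's libFuzzer (the build links libclang_rt.fuzzer_no_main, per the CONFIG_FUZZER_LIB setting dumped later in this log). A libFuzzer target only has to export one entry point; the stub below shows the contract, with parse_input standing in as a hypothetical placeholder for the real SPDK command parser under test:

```c
#include <stddef.h>
#include <stdint.h>

/* libFuzzer calls this once per generated input; returning 0 without
 * crashing means the input survived.  A real SPDK fuzzer feeds `data`
 * into the NVMe-oF request handling path. */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    if (size < 4) {
        return 0;                 /* too short to be interesting */
    }
    /* parse_input(data, size);     hypothetical target under test */
    return 0;
}
```

Built with `clang -fsanitize=fuzzer harness.c`, the runtime supplies main(), mutates inputs from the corpus directory created by llvm.sh above, and reports any crash or sanitizer violation as a failing input.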
00:07:51.443 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.443 17:51:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:51.443 17:51:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:51.443 17:51:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:51.443 17:51:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:51.443 17:51:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:51.443 17:51:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:51.443 17:51:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:51.443 17:51:44 -- scripts/common.sh@335 -- # IFS=.-: 00:07:51.443 17:51:44 -- scripts/common.sh@335 -- # read -ra ver1 00:07:51.443 17:51:44 -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.443 17:51:44 -- scripts/common.sh@336 -- # read -ra ver2 00:07:51.443 17:51:44 -- scripts/common.sh@337 -- # local 'op=<' 00:07:51.443 17:51:44 -- scripts/common.sh@339 -- # ver1_l=2 00:07:51.443 17:51:44 -- scripts/common.sh@340 -- # ver2_l=1 00:07:51.443 17:51:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:51.443 17:51:44 -- scripts/common.sh@343 -- # case "$op" in 00:07:51.443 17:51:44 -- scripts/common.sh@344 -- # : 1 00:07:51.443 17:51:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:51.443 17:51:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:51.443 17:51:44 -- scripts/common.sh@364 -- # decimal 1 00:07:51.443 17:51:44 -- scripts/common.sh@352 -- # local d=1 00:07:51.443 17:51:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.443 17:51:44 -- scripts/common.sh@354 -- # echo 1 00:07:51.443 17:51:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:51.443 17:51:44 -- scripts/common.sh@365 -- # decimal 2 00:07:51.444 17:51:44 -- scripts/common.sh@352 -- # local d=2 00:07:51.444 17:51:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.444 17:51:44 -- scripts/common.sh@354 -- # echo 2 00:07:51.444 17:51:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:51.444 17:51:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:51.444 17:51:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:51.444 17:51:44 -- scripts/common.sh@367 -- # return 0 00:07:51.444 17:51:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.444 17:51:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:51.444 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.444 --rc genhtml_branch_coverage=1 00:07:51.444 --rc genhtml_function_coverage=1 00:07:51.444 --rc genhtml_legend=1 00:07:51.444 --rc geninfo_all_blocks=1 00:07:51.444 --rc geninfo_unexecuted_blocks=1 00:07:51.444 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.444 ' 00:07:51.444 17:51:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:51.444 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.444 --rc genhtml_branch_coverage=1 00:07:51.444 --rc genhtml_function_coverage=1 00:07:51.444 --rc genhtml_legend=1 00:07:51.444 --rc geninfo_all_blocks=1 00:07:51.444 --rc geninfo_unexecuted_blocks=1 00:07:51.444 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.444 ' 00:07:51.444 17:51:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:51.444 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.444 --rc genhtml_branch_coverage=1 
00:07:51.444 --rc genhtml_function_coverage=1 00:07:51.444 --rc genhtml_legend=1 00:07:51.444 --rc geninfo_all_blocks=1 00:07:51.444 --rc geninfo_unexecuted_blocks=1 00:07:51.444 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.444 ' 00:07:51.444 17:51:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:51.444 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.444 --rc genhtml_branch_coverage=1 00:07:51.444 --rc genhtml_function_coverage=1 00:07:51.444 --rc genhtml_legend=1 00:07:51.444 --rc geninfo_all_blocks=1 00:07:51.444 --rc geninfo_unexecuted_blocks=1 00:07:51.444 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.444 ' 00:07:51.444 17:51:44 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:51.444 17:51:44 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:51.444 17:51:44 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:51.444 17:51:44 -- common/autotest_common.sh@34 -- # set -e 00:07:51.444 17:51:44 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:51.444 17:51:44 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:51.444 17:51:44 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:51.444 17:51:44 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:51.444 17:51:44 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:51.444 17:51:44 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:51.444 17:51:44 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:51.444 17:51:44 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:51.444 17:51:44 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:51.444 17:51:44 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:51.444 17:51:44 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:51.444 17:51:44 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:51.444 17:51:44 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:51.444 17:51:44 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:51.444 17:51:44 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:51.444 17:51:44 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:51.444 17:51:44 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:51.444 17:51:44 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:51.444 17:51:44 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:51.444 17:51:44 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:51.444 17:51:44 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:51.444 17:51:44 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:51.444 17:51:44 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:51.444 17:51:44 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:51.444 17:51:44 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:51.444 17:51:44 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:51.444 17:51:44 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:51.444 17:51:44 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:51.444 17:51:44 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:51.444 
17:51:44 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:51.444 17:51:44 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:51.444 17:51:44 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:51.444 17:51:44 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:51.444 17:51:44 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:51.444 17:51:44 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:51.444 17:51:44 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:51.444 17:51:44 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:51.444 17:51:44 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:51.444 17:51:44 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:51.444 17:51:44 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.444 17:51:44 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:51.444 17:51:44 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:51.444 17:51:44 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:51.444 17:51:44 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:51.444 17:51:44 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:51.444 17:51:44 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:51.444 17:51:44 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:51.444 17:51:44 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:51.444 17:51:44 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:51.444 17:51:44 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:51.444 17:51:44 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:51.444 17:51:44 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:51.444 17:51:44 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:51.444 17:51:44 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:51.444 17:51:44 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:51.444 17:51:44 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:51.444 17:51:44 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:51.444 17:51:44 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:51.444 17:51:44 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:51.444 17:51:44 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:51.444 17:51:44 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:51.444 17:51:44 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:51.444 17:51:44 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:51.444 17:51:44 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:51.444 17:51:44 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.444 17:51:44 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:51.444 17:51:44 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:51.444 17:51:44 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:51.444 17:51:44 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:51.444 17:51:44 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:51.444 17:51:44 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:51.444 17:51:44 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:51.444 17:51:44 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:51.444 17:51:44 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:51.444 17:51:44 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:51.444 17:51:44 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:51.444 17:51:44 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:51.444 17:51:44 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:51.444 17:51:44 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:51.444 17:51:44 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:51.444 17:51:44 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:51.444 17:51:44 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:51.444 17:51:44 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:51.444 17:51:44 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:51.444 17:51:44 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:51.444 17:51:44 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:51.444 17:51:44 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:51.444 17:51:44 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:51.444 17:51:44 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.444 17:51:44 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:51.444 17:51:44 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.444 17:51:44 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:51.444 17:51:44 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:51.444 17:51:44 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:51.444 17:51:44 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:51.444 17:51:44 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:51.444 17:51:44 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:51.444 17:51:44 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:51.445 17:51:44 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:51.445 #define SPDK_CONFIG_H 00:07:51.445 #define SPDK_CONFIG_APPS 1 00:07:51.445 #define SPDK_CONFIG_ARCH native 00:07:51.445 #undef SPDK_CONFIG_ASAN 00:07:51.445 #undef SPDK_CONFIG_AVAHI 00:07:51.445 #undef SPDK_CONFIG_CET 00:07:51.445 #define SPDK_CONFIG_COVERAGE 1 00:07:51.445 #define SPDK_CONFIG_CROSS_PREFIX 00:07:51.445 #undef SPDK_CONFIG_CRYPTO 00:07:51.445 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:51.445 #undef SPDK_CONFIG_CUSTOMOCF 00:07:51.445 #undef SPDK_CONFIG_DAOS 00:07:51.445 #define SPDK_CONFIG_DAOS_DIR 00:07:51.445 #define SPDK_CONFIG_DEBUG 1 00:07:51.445 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:51.445 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.445 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:51.445 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.445 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:51.445 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:51.445 #define SPDK_CONFIG_EXAMPLES 1 00:07:51.445 #undef SPDK_CONFIG_FC 00:07:51.445 #define SPDK_CONFIG_FC_PATH 00:07:51.445 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:51.445 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:51.445 #undef SPDK_CONFIG_FUSE 00:07:51.445 #define SPDK_CONFIG_FUZZER 1 00:07:51.445 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:51.445 #undef SPDK_CONFIG_GOLANG 00:07:51.445 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:51.445 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:51.445 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:51.445 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:51.445 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:51.445 #define SPDK_CONFIG_IDXD 1 00:07:51.445 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:51.445 #undef SPDK_CONFIG_IPSEC_MB 00:07:51.445 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:51.445 #define SPDK_CONFIG_ISAL 1 00:07:51.445 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:51.445 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:51.445 #define SPDK_CONFIG_LIBDIR 00:07:51.445 #undef SPDK_CONFIG_LTO 00:07:51.445 #define SPDK_CONFIG_MAX_LCORES 00:07:51.445 #define SPDK_CONFIG_NVME_CUSE 1 00:07:51.445 #undef SPDK_CONFIG_OCF 00:07:51.445 #define SPDK_CONFIG_OCF_PATH 00:07:51.445 #define SPDK_CONFIG_OPENSSL_PATH 00:07:51.445 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:51.445 #undef SPDK_CONFIG_PGO_USE 00:07:51.445 #define SPDK_CONFIG_PREFIX /usr/local 00:07:51.445 #undef SPDK_CONFIG_RAID5F 00:07:51.445 #undef SPDK_CONFIG_RBD 00:07:51.445 #define SPDK_CONFIG_RDMA 1 00:07:51.445 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:51.445 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:51.445 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:51.445 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:51.445 #undef SPDK_CONFIG_SHARED 00:07:51.445 #undef SPDK_CONFIG_SMA 00:07:51.445 #define SPDK_CONFIG_TESTS 1 00:07:51.445 #undef SPDK_CONFIG_TSAN 00:07:51.445 #define SPDK_CONFIG_UBLK 1 00:07:51.445 #define SPDK_CONFIG_UBSAN 1 00:07:51.445 #undef SPDK_CONFIG_UNIT_TESTS 00:07:51.445 #undef SPDK_CONFIG_URING 00:07:51.445 #define SPDK_CONFIG_URING_PATH 00:07:51.445 #undef SPDK_CONFIG_URING_ZNS 00:07:51.445 #undef SPDK_CONFIG_USDT 00:07:51.445 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:51.445 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:51.445 #define SPDK_CONFIG_VFIO_USER 1 00:07:51.445 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:51.445 #define SPDK_CONFIG_VHOST 1 00:07:51.445 #define SPDK_CONFIG_VIRTIO 1 00:07:51.445 #undef SPDK_CONFIG_VTUNE 00:07:51.445 #define SPDK_CONFIG_VTUNE_DIR 00:07:51.445 #define SPDK_CONFIG_WERROR 1 00:07:51.445 #define SPDK_CONFIG_WPDK_DIR 00:07:51.445 #undef SPDK_CONFIG_XNVME 00:07:51.445 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:51.445 17:51:44 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:51.445 17:51:44 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:51.445 17:51:44 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:51.445 17:51:44 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:51.445 17:51:44 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:51.445 17:51:44 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.445 17:51:44 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.445 17:51:44 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.445 17:51:44 -- paths/export.sh@5 -- # export PATH 00:07:51.445 17:51:44 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.445 17:51:44 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:51.445 17:51:44 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:51.445 17:51:44 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:51.445 17:51:44 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:51.445 17:51:44 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:51.445 17:51:44 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:51.445 17:51:44 -- pm/common@16 -- # TEST_TAG=N/A 00:07:51.445 17:51:44 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:51.445 17:51:44 -- common/autotest_common.sh@52 -- # : 1 00:07:51.445 17:51:44 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:51.445 17:51:44 -- common/autotest_common.sh@56 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:51.445 17:51:44 -- common/autotest_common.sh@58 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:51.445 17:51:44 -- common/autotest_common.sh@60 -- # : 1 00:07:51.445 17:51:44 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:51.445 17:51:44 -- common/autotest_common.sh@62 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:51.445 17:51:44 -- common/autotest_common.sh@64 -- # : 00:07:51.445 17:51:44 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:51.445 17:51:44 -- common/autotest_common.sh@66 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:51.445 17:51:44 -- common/autotest_common.sh@68 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:51.445 17:51:44 -- common/autotest_common.sh@70 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:51.445 17:51:44 -- common/autotest_common.sh@72 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:51.445 17:51:44 -- common/autotest_common.sh@74 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:51.445 17:51:44 -- common/autotest_common.sh@76 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:51.445 17:51:44 -- common/autotest_common.sh@78 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:51.445 17:51:44 -- common/autotest_common.sh@80 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:51.445 17:51:44 -- common/autotest_common.sh@82 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:51.445 17:51:44 -- common/autotest_common.sh@84 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:51.445 17:51:44 -- common/autotest_common.sh@86 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:51.445 17:51:44 -- common/autotest_common.sh@88 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:51.445 17:51:44 -- common/autotest_common.sh@90 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:51.445 17:51:44 -- common/autotest_common.sh@92 -- # : 1 00:07:51.445 17:51:44 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:51.445 17:51:44 -- common/autotest_common.sh@94 -- # : 1 00:07:51.445 17:51:44 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:51.445 17:51:44 -- common/autotest_common.sh@96 -- # : rdma 00:07:51.445 17:51:44 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:51.445 17:51:44 -- common/autotest_common.sh@98 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:51.445 17:51:44 -- common/autotest_common.sh@100 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:51.445 17:51:44 -- common/autotest_common.sh@102 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:51.445 17:51:44 -- common/autotest_common.sh@104 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:51.445 17:51:44 -- common/autotest_common.sh@106 -- # : 0 00:07:51.445 17:51:44 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:51.445 17:51:44 -- common/autotest_common.sh@108 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:51.446 17:51:44 -- common/autotest_common.sh@110 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:51.446 17:51:44 -- common/autotest_common.sh@112 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:51.446 17:51:44 -- common/autotest_common.sh@114 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:51.446 17:51:44 -- common/autotest_common.sh@116 -- # : 1 00:07:51.446 17:51:44 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:51.446 17:51:44 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.446 17:51:44 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:51.446 17:51:44 -- common/autotest_common.sh@120 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:51.446 17:51:44 -- common/autotest_common.sh@122 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:51.446 17:51:44 -- common/autotest_common.sh@124 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:51.446 17:51:44 -- common/autotest_common.sh@126 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:51.446 17:51:44 -- common/autotest_common.sh@128 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:51.446 17:51:44 -- common/autotest_common.sh@130 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:51.446 17:51:44 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:51.446 17:51:44 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:51.446 17:51:44 -- common/autotest_common.sh@134 -- # : true 00:07:51.446 17:51:44 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:51.446 17:51:44 -- common/autotest_common.sh@136 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:51.446 17:51:44 -- common/autotest_common.sh@138 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:51.446 17:51:44 -- common/autotest_common.sh@140 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:51.446 17:51:44 -- common/autotest_common.sh@142 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:51.446 17:51:44 -- common/autotest_common.sh@144 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:51.446 17:51:44 -- common/autotest_common.sh@146 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:51.446 17:51:44 -- common/autotest_common.sh@148 -- # : 00:07:51.446 17:51:44 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:51.446 17:51:44 -- common/autotest_common.sh@150 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:51.446 17:51:44 -- common/autotest_common.sh@152 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:51.446 17:51:44 -- common/autotest_common.sh@154 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:51.446 17:51:44 -- 
common/autotest_common.sh@156 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:51.446 17:51:44 -- common/autotest_common.sh@158 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:51.446 17:51:44 -- common/autotest_common.sh@160 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:51.446 17:51:44 -- common/autotest_common.sh@163 -- # : 00:07:51.446 17:51:44 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:51.446 17:51:44 -- common/autotest_common.sh@165 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:51.446 17:51:44 -- common/autotest_common.sh@167 -- # : 0 00:07:51.446 17:51:44 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:51.446 17:51:44 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.446 17:51:44 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:51.446 17:51:44 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:51.446 17:51:44 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:51.446 17:51:44 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:51.446 17:51:44 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:51.446 17:51:44 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:51.446 17:51:44 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:51.446 17:51:44 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:51.446 17:51:44 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:51.446 17:51:44 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:51.446 17:51:44 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:51.446 17:51:44 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:51.446 17:51:44 -- common/autotest_common.sh@196 -- # cat 00:07:51.446 17:51:44 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:51.446 17:51:44 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:51.446 17:51:44 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:51.446 17:51:44 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:51.446 17:51:44 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:51.446 17:51:44 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:51.446 17:51:44 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:51.446 17:51:44 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.446 17:51:44 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.446 17:51:44 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.446 17:51:44 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.446 17:51:44 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:51.446 17:51:44 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:51.446 17:51:44 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:51.446 17:51:44 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:51.446 17:51:44 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:51.446 17:51:44 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:51.446 17:51:44 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:51.446 17:51:44 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:51.446 17:51:44 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:51.446 17:51:44 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:51.446 17:51:44 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:51.446 17:51:44 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:51.446 17:51:44 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:51.446 17:51:44 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:51.447 17:51:44 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:51.447 17:51:44 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:51.447 17:51:44 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:51.447 17:51:44 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:51.447 17:51:44 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:51.447 17:51:44 -- common/autotest_common.sh@259 -- # valgrind= 00:07:51.447 17:51:44 -- common/autotest_common.sh@265 -- # uname -s 00:07:51.447 17:51:44 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:51.447 17:51:44 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:51.447 17:51:44 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:51.447 17:51:44 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:51.447 17:51:44 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:51.447 17:51:44 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:51.447 17:51:44 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:51.447 17:51:44 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:51.447 17:51:44 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:51.447 17:51:44 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:51.447 17:51:44 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:51.447 17:51:44 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:51.447 17:51:44 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:51.447 17:51:44 -- common/autotest_common.sh@319 -- # [[ -z 633740 ]] 00:07:51.447 17:51:44 -- common/autotest_common.sh@319 -- # kill -0 633740 00:07:51.447 17:51:44 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:51.447 17:51:44 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:51.447 17:51:44 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:51.447 17:51:44 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:51.447 17:51:44 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:51.447 17:51:44 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:51.447 17:51:44 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:51.447 17:51:44 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:51.447 17:51:44 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.fN4aR7 00:07:51.447 17:51:44 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:51.447 17:51:44 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:51.447 17:51:44 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:51.447 17:51:44 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.fN4aR7/tests/nvmf /tmp/spdk.fN4aR7 00:07:51.447 17:51:44 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:51.447 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.447 17:51:44 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:51.447 17:51:44 -- common/autotest_common.sh@328 -- # df -T 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 
-- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=53092982784 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=8637624320 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864044032 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340121600 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864982016 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=323584 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:51.707 17:51:44 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:51.707 17:51:44 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:51.707 17:51:44 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:51.707 17:51:44 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:51.707 * Looking for test storage... 
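What follows is set_test_storage scanning the `df -T` output captured above for a mount with enough free space for the ~2 GiB request, trying the test directory first and the mktemp fallback after it. A minimal bash sketch of that selection step, assuming simplified names (pick_test_storage and its variables are illustrative, not the actual SPDK helper, and GNU df is assumed):

    # Sketch only: mirrors the traced selection logic, not the real helper.
    pick_test_storage() {
        local requested_size=$1; shift      # bytes needed, e.g. 2214592512
        local dir avail
        for dir in "$@"; do                 # candidates, preferred first
            # Free bytes on the filesystem holding this candidate (GNU df).
            avail=$(df --output=avail -B1 -- "$dir" | tail -n1)
            if (( avail >= requested_size )); then
                printf '* Found test storage at %s\n' "$dir" >&2
                echo "$dir"
                return 0
            fi
        done
        return 1
    }

    # Usage mirroring the trace: testdir first, then the mktemp fallback tree.
    pick_test_storage 2214592512 \
        "$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback"

The real script additionally special-cases tmpfs/ramfs mounts and grows the fallback size check, which the trace shows as the overlay/tmpfs comparisons below.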
00:07:51.707 17:51:44 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:51.707 17:51:44 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:51.707 17:51:44 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.707 17:51:44 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:51.707 17:51:44 -- common/autotest_common.sh@373 -- # mount=/ 00:07:51.707 17:51:44 -- common/autotest_common.sh@375 -- # target_space=53092982784 00:07:51.707 17:51:44 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:51.707 17:51:44 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:51.707 17:51:44 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:51.707 17:51:44 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:51.707 17:51:44 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:51.707 17:51:44 -- common/autotest_common.sh@382 -- # new_size=10852216832 00:07:51.707 17:51:44 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:51.708 17:51:44 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.708 17:51:44 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.708 17:51:44 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.708 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.708 17:51:44 -- common/autotest_common.sh@390 -- # return 0 00:07:51.708 17:51:44 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:51.708 17:51:44 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:51.708 17:51:44 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:51.708 17:51:44 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:51.708 17:51:44 -- common/autotest_common.sh@1682 -- # true 00:07:51.708 17:51:44 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:51.708 17:51:44 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:51.708 17:51:44 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:51.708 17:51:44 -- common/autotest_common.sh@27 -- # exec 00:07:51.708 17:51:44 -- common/autotest_common.sh@29 -- # exec 00:07:51.708 17:51:44 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:51.708 17:51:44 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:51.708 17:51:44 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:51.708 17:51:44 -- common/autotest_common.sh@18 -- # set -x 00:07:51.708 17:51:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:51.708 17:51:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:51.708 17:51:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:51.708 17:51:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:51.708 17:51:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:51.708 17:51:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:51.708 17:51:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:51.708 17:51:44 -- scripts/common.sh@335 -- # IFS=.-: 00:07:51.708 17:51:44 -- scripts/common.sh@335 -- # read -ra ver1 00:07:51.708 17:51:44 -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.708 17:51:44 -- scripts/common.sh@336 -- # read -ra ver2 00:07:51.708 17:51:44 -- scripts/common.sh@337 -- # local 'op=<' 00:07:51.708 17:51:44 -- scripts/common.sh@339 -- # ver1_l=2 00:07:51.708 17:51:44 -- scripts/common.sh@340 -- # ver2_l=1 00:07:51.708 17:51:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:51.708 17:51:44 -- scripts/common.sh@343 -- # case "$op" in 00:07:51.708 17:51:44 -- scripts/common.sh@344 -- # : 1 00:07:51.708 17:51:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:51.708 17:51:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:51.708 17:51:44 -- scripts/common.sh@364 -- # decimal 1 00:07:51.708 17:51:44 -- scripts/common.sh@352 -- # local d=1 00:07:51.708 17:51:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.708 17:51:44 -- scripts/common.sh@354 -- # echo 1 00:07:51.708 17:51:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:51.708 17:51:44 -- scripts/common.sh@365 -- # decimal 2 00:07:51.708 17:51:44 -- scripts/common.sh@352 -- # local d=2 00:07:51.708 17:51:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.708 17:51:44 -- scripts/common.sh@354 -- # echo 2 00:07:51.708 17:51:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:51.708 17:51:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:51.708 17:51:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:51.708 17:51:44 -- scripts/common.sh@367 -- # return 0 00:07:51.708 17:51:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.708 17:51:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:51.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.708 --rc genhtml_branch_coverage=1 00:07:51.708 --rc genhtml_function_coverage=1 00:07:51.708 --rc genhtml_legend=1 00:07:51.708 --rc geninfo_all_blocks=1 00:07:51.708 --rc geninfo_unexecuted_blocks=1 00:07:51.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.708 ' 00:07:51.708 17:51:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:51.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.708 --rc genhtml_branch_coverage=1 00:07:51.708 --rc genhtml_function_coverage=1 00:07:51.708 --rc genhtml_legend=1 00:07:51.708 --rc geninfo_all_blocks=1 00:07:51.708 --rc geninfo_unexecuted_blocks=1 00:07:51.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.708 ' 00:07:51.708 17:51:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:51.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:51.708 --rc genhtml_branch_coverage=1 00:07:51.708 --rc genhtml_function_coverage=1 00:07:51.708 --rc genhtml_legend=1 00:07:51.708 --rc geninfo_all_blocks=1 00:07:51.708 --rc geninfo_unexecuted_blocks=1 00:07:51.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.708 ' 00:07:51.708 17:51:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:51.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.708 --rc genhtml_branch_coverage=1 00:07:51.708 --rc genhtml_function_coverage=1 00:07:51.708 --rc genhtml_legend=1 00:07:51.708 --rc geninfo_all_blocks=1 00:07:51.708 --rc geninfo_unexecuted_blocks=1 00:07:51.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.708 ' 00:07:51.708 17:51:44 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:51.708 17:51:44 -- ../common.sh@8 -- # pids=() 00:07:51.708 17:51:44 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:51.708 17:51:44 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:51.708 17:51:44 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:51.708 17:51:44 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:51.708 17:51:44 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:51.708 17:51:44 -- nvmf/run.sh@61 -- # mem_size=512 00:07:51.708 17:51:44 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:51.708 17:51:44 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:51.708 17:51:44 -- ../common.sh@69 -- # local fuzz_num=25 00:07:51.708 17:51:44 -- ../common.sh@70 -- # local time=1 00:07:51.708 17:51:44 -- ../common.sh@72 -- # (( i = 0 )) 00:07:51.708 17:51:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.708 17:51:44 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:51.708 17:51:44 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:51.708 17:51:44 -- nvmf/run.sh@24 -- # local timen=1 00:07:51.708 17:51:44 -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.708 17:51:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.708 17:51:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:51.708 17:51:44 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:51.708 17:51:44 -- nvmf/run.sh@29 -- # port=4400 00:07:51.708 17:51:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.708 17:51:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:51.708 17:51:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.708 17:51:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:51.708 [2024-11-19 17:51:44.467184] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 
initialization... 00:07:51.708 [2024-11-19 17:51:44.467279] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid633864 ] 00:07:51.708 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.968 [2024-11-19 17:51:44.723886] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.968 [2024-11-19 17:51:44.752845] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:51.968 [2024-11-19 17:51:44.752988] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.968 [2024-11-19 17:51:44.804321] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.968 [2024-11-19 17:51:44.820639] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:52.227 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.227 INFO: Seed: 1245288460 00:07:52.227 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:52.227 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:52.227 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:52.227 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.227 #2 INITED exec/s: 0 rss: 60Mb 00:07:52.227 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.227 This may also happen if the target rejected all inputs we tried so far 00:07:52.227 [2024-11-19 17:51:44.865218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.227 [2024-11-19 17:51:44.865252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.486 NEW_FUNC[1/671]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:52.486 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.486 #39 NEW cov: 11561 ft: 11563 corp: 2/77b lim: 320 exec/s: 0 rss: 67Mb L: 76/76 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:52.486 [2024-11-19 17:51:45.185968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:9c9c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:52.486 [2024-11-19 17:51:45.186006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.486 #44 NEW cov: 11675 ft: 12084 corp: 3/164b lim: 320 exec/s: 0 rss: 67Mb L: 87/87 MS: 5 ChangeBit-CopyPart-EraseBytes-InsertByte-InsertRepeatedBytes- 00:07:52.486 [2024-11-19 17:51:45.236019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.486 [2024-11-19 17:51:45.236050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.486 #45 NEW cov: 11681 ft: 12307 corp: 4/240b lim: 320 exec/s: 0 rss: 67Mb L: 76/87 MS: 1 ChangeByte- 00:07:52.486 [2024-11-19 
17:51:45.296215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.486 [2024-11-19 17:51:45.296250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.486 #46 NEW cov: 11766 ft: 12625 corp: 5/316b lim: 320 exec/s: 0 rss: 67Mb L: 76/87 MS: 1 CopyPart- 00:07:52.746 [2024-11-19 17:51:45.356321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.746 [2024-11-19 17:51:45.356353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.746 #47 NEW cov: 11766 ft: 12782 corp: 6/392b lim: 320 exec/s: 0 rss: 67Mb L: 76/87 MS: 1 ChangeBinInt- 00:07:52.746 [2024-11-19 17:51:45.416496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.746 [2024-11-19 17:51:45.416526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.746 #48 NEW cov: 11766 ft: 12834 corp: 7/468b lim: 320 exec/s: 0 rss: 67Mb L: 76/87 MS: 1 ChangeByte- 00:07:52.746 [2024-11-19 17:51:45.476691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.746 [2024-11-19 17:51:45.476723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.746 #49 NEW cov: 11766 ft: 12906 corp: 8/544b lim: 320 exec/s: 0 rss: 67Mb L: 76/87 MS: 1 CopyPart- 00:07:52.746 [2024-11-19 17:51:45.526783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:b000000 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.746 [2024-11-19 17:51:45.526814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.746 #50 NEW cov: 11766 ft: 12961 corp: 9/622b lim: 320 exec/s: 0 rss: 68Mb L: 78/87 MS: 1 CMP- DE: "\013\000"- 00:07:52.746 [2024-11-19 17:51:45.597054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:9c9c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:52.746 [2024-11-19 17:51:45.597088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.746 [2024-11-19 17:51:45.597123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (9c) qid:0 cid:5 nsid:df9c9c9c cdw10:dfdfdfdf cdw11:dfdfdfdf 00:07:52.746 [2024-11-19 17:51:45.597138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.006 #51 NEW cov: 11786 ft: 13201 corp: 10/790b lim: 320 exec/s: 0 rss: 68Mb L: 168/168 MS: 1 InsertRepeatedBytes- 00:07:53.006 [2024-11-19 17:51:45.667143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:3b0a cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.006 [2024-11-19 17:51:45.667174] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.006 #59 NEW cov: 11786 ft: 13272 corp: 11/871b lim: 320 exec/s: 0 rss: 68Mb L: 81/168 MS: 3 ShuffleBytes-CMP-CrossOver- DE: "\377\001\000\000"- 00:07:53.006 [2024-11-19 17:51:45.717320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (4a) qid:0 cid:4 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:53.006 [2024-11-19 17:51:45.717351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.006 #62 NEW cov: 11786 ft: 13284 corp: 12/971b lim: 320 exec/s: 0 rss: 68Mb L: 100/168 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:53.006 [2024-11-19 17:51:45.768352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.006 [2024-11-19 17:51:45.768424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.006 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.006 #63 NEW cov: 11803 ft: 13469 corp: 13/1047b lim: 320 exec/s: 0 rss: 68Mb L: 76/168 MS: 1 ChangeBinInt- 00:07:53.006 [2024-11-19 17:51:45.818284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:3b0a cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.006 [2024-11-19 17:51:45.818308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.006 [2024-11-19 17:51:45.818363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (9c) qid:0 cid:5 nsid:9c9c9c9c cdw10:9c9c9c9c cdw11:9c9c9c9c 00:07:53.006 [2024-11-19 17:51:45.818376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.006 #64 NEW cov: 11803 ft: 13590 corp: 14/1186b lim: 320 exec/s: 0 rss: 68Mb L: 139/168 MS: 1 CrossOver- 00:07:53.006 [2024-11-19 17:51:45.858313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.006 [2024-11-19 17:51:45.858338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #71 NEW cov: 11803 ft: 13711 corp: 15/1313b lim: 320 exec/s: 71 rss: 68Mb L: 127/168 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:53.266 [2024-11-19 17:51:45.898380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.266 [2024-11-19 17:51:45.898404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #72 NEW cov: 11803 ft: 13784 corp: 16/1389b lim: 320 exec/s: 72 rss: 68Mb L: 76/168 MS: 1 ChangeBit- 00:07:53.266 [2024-11-19 17:51:45.928482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.266 [2024-11-19 17:51:45.928506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #73 NEW cov: 11803 ft: 13907 corp: 17/1510b lim: 320 exec/s: 73 rss: 68Mb L: 121/168 MS: 1 CopyPart- 00:07:53.266 [2024-11-19 17:51:45.968578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.266 [2024-11-19 17:51:45.968606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #74 NEW cov: 11803 ft: 13925 corp: 18/1637b lim: 320 exec/s: 74 rss: 68Mb L: 127/168 MS: 1 CMP- DE: "\377\021"- 00:07:53.266 [2024-11-19 17:51:46.008850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:9c9c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:53.266 [2024-11-19 17:51:46.008874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 [2024-11-19 17:51:46.008945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (9c) qid:0 cid:5 nsid:df9c9c9c cdw10:dfdfdfdf cdw11:dfdfdfdf 00:07:53.266 [2024-11-19 17:51:46.008958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.266 #75 NEW cov: 11803 ft: 13951 corp: 19/1806b lim: 320 exec/s: 75 rss: 68Mb L: 169/169 MS: 1 InsertByte- 00:07:53.266 [2024-11-19 17:51:46.048835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:009c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:53.266 [2024-11-19 17:51:46.048859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #76 NEW cov: 11803 ft: 13987 corp: 20/1894b lim: 320 exec/s: 76 rss: 68Mb L: 88/169 MS: 1 InsertByte- 00:07:53.266 [2024-11-19 17:51:46.089040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:9c9c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:53.266 [2024-11-19 17:51:46.089070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 [2024-11-19 17:51:46.089141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (9c) qid:0 cid:5 nsid:df9c9c9c cdw10:dfdfdfdf cdw11:dfdfdfdf 00:07:53.266 [2024-11-19 17:51:46.089154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.266 #77 NEW cov: 11803 ft: 14015 corp: 21/2063b lim: 320 exec/s: 77 rss: 68Mb L: 169/169 MS: 1 ShuffleBytes- 00:07:53.266 [2024-11-19 17:51:46.129189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.266 [2024-11-19 17:51:46.129214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.526 [2024-11-19 17:51:46.129264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.526 [2024-11-19 17:51:46.129280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.526 #78 NEW cov: 11804 ft: 14062 corp: 22/2212b lim: 320 exec/s: 78 rss: 68Mb L: 149/169 MS: 1 CopyPart- 00:07:53.526 [2024-11-19 17:51:46.169188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:009c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:53.526 [2024-11-19 17:51:46.169212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.526 #79 NEW cov: 11804 ft: 14118 corp: 23/2300b lim: 320 exec/s: 79 rss: 68Mb L: 88/169 MS: 1 ChangeByte- 00:07:53.526 [2024-11-19 17:51:46.209407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.526 [2024-11-19 17:51:46.209433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.526 NEW_FUNC[1/7]: 0x10e3598 in nvmf_ctrlr_abort /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3278 00:07:53.526 NEW_FUNC[2/7]: 0x11366a8 in nvmf_ctrlr_abort_on_pg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3254 00:07:53.526 #82 NEW cov: 11996 ft: 14556 corp: 24/2425b lim: 320 exec/s: 82 rss: 68Mb L: 125/169 MS: 3 ChangeBit-InsertRepeatedBytes-CrossOver- 00:07:53.526 [2024-11-19 17:51:46.249755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.526 [2024-11-19 17:51:46.249780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.526 [2024-11-19 17:51:46.249831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.526 [2024-11-19 17:51:46.249845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.526 [2024-11-19 17:51:46.249895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.526 [2024-11-19 17:51:46.249908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.526 [2024-11-19 17:51:46.249958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.526 [2024-11-19 17:51:46.249971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.526 #83 NEW cov: 11996 ft: 14817 corp: 25/2701b lim: 320 exec/s: 83 rss: 68Mb L: 276/276 MS: 1 CopyPart- 00:07:53.526 [2024-11-19 17:51:46.289503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1 00:07:53.526 [2024-11-19 17:51:46.289530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.526 #84 NEW cov: 11996 ft: 14832 corp: 26/2777b lim: 320 exec/s: 84 rss: 68Mb L: 76/276 MS: 1 ChangeBit- 00:07:53.526 [2024-11-19 17:51:46.329641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00800000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.526 [2024-11-19 17:51:46.329667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.526 #85 NEW cov: 11996 ft: 14858 corp: 27/2898b lim: 320 exec/s: 85 rss: 68Mb L: 121/276 MS: 1 ChangeBit- 00:07:53.526 [2024-11-19 17:51:46.369789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.526 [2024-11-19 17:51:46.369813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.526 #86 NEW cov: 11996 ft: 14879 corp: 28/2974b lim: 320 exec/s: 86 rss: 68Mb L: 76/276 MS: 1 ChangeBit- 00:07:53.785 [2024-11-19 17:51:46.399851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.785 [2024-11-19 17:51:46.399876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.785 #87 NEW cov: 11996 ft: 14887 corp: 29/3051b lim: 320 exec/s: 87 rss: 69Mb L: 77/276 MS: 1 InsertByte- 00:07:53.785 [2024-11-19 17:51:46.440096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00800000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.785 [2024-11-19 17:51:46.440120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.785 [2024-11-19 17:51:46.440169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.785 [2024-11-19 17:51:46.440182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.785 #88 NEW cov: 11996 ft: 14900 corp: 30/3202b lim: 320 exec/s: 88 rss: 69Mb L: 151/276 MS: 1 CrossOver- 00:07:53.785 [2024-11-19 17:51:46.480054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:9c9c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:53.785 [2024-11-19 17:51:46.480078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.785 #89 NEW cov: 11996 ft: 14975 corp: 31/3289b lim: 320 exec/s: 89 rss: 69Mb L: 87/276 MS: 1 ChangeByte- 00:07:53.785 [2024-11-19 17:51:46.520196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.785 [2024-11-19 17:51:46.520219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.785 #90 NEW cov: 11996 ft: 14998 corp: 32/3365b lim: 320 exec/s: 90 rss: 69Mb L: 76/276 MS: 1 ChangeByte- 00:07:53.785 [2024-11-19 17:51:46.550291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:b000000 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.785 [2024-11-19 17:51:46.550315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.785 #91 NEW cov: 11996 
ft: 15010 corp: 33/3429b lim: 320 exec/s: 91 rss: 69Mb L: 64/276 MS: 1 EraseBytes- 00:07:53.785 [2024-11-19 17:51:46.590432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:b000000 cdw10:00000000 cdw11:00000016 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.786 [2024-11-19 17:51:46.590460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.786 #92 NEW cov: 11996 ft: 15021 corp: 34/3508b lim: 320 exec/s: 92 rss: 69Mb L: 79/276 MS: 1 InsertByte- 00:07:53.786 [2024-11-19 17:51:46.630699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:0a0a0a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0xa0a0a0a0a0a0a0a 00:07:53.786 [2024-11-19 17:51:46.630733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.045 #95 NEW cov: 12000 ft: 15075 corp: 35/3667b lim: 320 exec/s: 95 rss: 69Mb L: 159/276 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:07:54.045 [2024-11-19 17:51:46.670790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00800000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.045 [2024-11-19 17:51:46.670815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.045 [2024-11-19 17:51:46.670864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.045 [2024-11-19 17:51:46.670878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.045 #96 NEW cov: 12000 ft: 15161 corp: 36/3847b lim: 320 exec/s: 96 rss: 69Mb L: 180/276 MS: 1 CopyPart- 00:07:54.045 [2024-11-19 17:51:46.710817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.045 [2024-11-19 17:51:46.710841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.045 #97 NEW cov: 12000 ft: 15198 corp: 37/3923b lim: 320 exec/s: 97 rss: 69Mb L: 76/276 MS: 1 ChangeBinInt- 00:07:54.045 [2024-11-19 17:51:46.750974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (4a) qid:0 cid:4 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:54.045 [2024-11-19 17:51:46.750999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.045 #98 NEW cov: 12007 ft: 15220 corp: 38/3991b lim: 320 exec/s: 98 rss: 69Mb L: 68/276 MS: 1 EraseBytes- 00:07:54.045 [2024-11-19 17:51:46.791152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.045 [2024-11-19 17:51:46.791179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.045 #99 NEW cov: 12007 ft: 15242 corp: 39/4118b lim: 320 exec/s: 99 rss: 69Mb L: 127/276 MS: 1 CMP- DE: " \000"- 00:07:54.045 [2024-11-19 17:51:46.831142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3b) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.045 [2024-11-19 17:51:46.831166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.045 #100 NEW cov: 12007 ft: 15245 corp: 40/4226b lim: 320 exec/s: 100 rss: 69Mb L: 108/276 MS: 1 CrossOver- 00:07:54.045 [2024-11-19 17:51:46.861527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2f) qid:0 cid:4 nsid:9c9c9c9c cdw10:52525252 cdw11:52525252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5252529c9c9c9c9c 00:07:54.046 [2024-11-19 17:51:46.861551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.046 [2024-11-19 17:51:46.861619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (52) qid:0 cid:5 nsid:52525252 cdw10:52525252 cdw11:52525252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5252525252525252 00:07:54.046 [2024-11-19 17:51:46.861634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.046 [2024-11-19 17:51:46.861692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (52) qid:0 cid:6 nsid:52525252 cdw10:9c9c9c9c cdw11:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:54.046 [2024-11-19 17:51:46.861709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.046 #101 NEW cov: 12007 ft: 15474 corp: 41/4435b lim: 320 exec/s: 50 rss: 69Mb L: 209/276 MS: 1 InsertRepeatedBytes- 00:07:54.046 #101 DONE cov: 12007 ft: 15474 corp: 41/4435b lim: 320 exec/s: 50 rss: 69Mb 00:07:54.046 ###### Recommended dictionary. ###### 00:07:54.046 "\013\000" # Uses: 0 00:07:54.046 "\377\001\000\000" # Uses: 0 00:07:54.046 "\377\021" # Uses: 0 00:07:54.046 " \000" # Uses: 0 00:07:54.046 ###### End of recommended dictionary. 
###### 00:07:54.046 Done 101 runs in 2 second(s) 00:07:54.305 17:51:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:54.305 17:51:47 -- ../common.sh@72 -- # (( i++ )) 00:07:54.305 17:51:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.305 17:51:47 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:54.305 17:51:47 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:54.305 17:51:47 -- nvmf/run.sh@24 -- # local timen=1 00:07:54.305 17:51:47 -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.305 17:51:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:54.305 17:51:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:54.305 17:51:47 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:54.305 17:51:47 -- nvmf/run.sh@29 -- # port=4401 00:07:54.305 17:51:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:54.305 17:51:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:54.305 17:51:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.305 17:51:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:54.305 [2024-11-19 17:51:47.046874] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:54.305 [2024-11-19 17:51:47.046968] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid634285 ] 00:07:54.305 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.565 [2024-11-19 17:51:47.303300] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.565 [2024-11-19 17:51:47.331929] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.565 [2024-11-19 17:51:47.332051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.565 [2024-11-19 17:51:47.383466] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.565 [2024-11-19 17:51:47.399819] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:54.565 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.565 INFO: Seed: 3824293413 00:07:54.824 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:54.824 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:54.824 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:54.824 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.824 #2 INITED exec/s: 0 rss: 59Mb 00:07:54.824 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:54.824 This may also happen if the target rejected all inputs we tried so far 00:07:54.824 [2024-11-19 17:51:47.444313] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:54.824 [2024-11-19 17:51:47.444405] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:54.824 [2024-11-19 17:51:47.444463] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:54.824 [2024-11-19 17:51:47.444580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.824 [2024-11-19 17:51:47.444610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.824 [2024-11-19 17:51:47.444642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.824 [2024-11-19 17:51:47.444657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.824 [2024-11-19 17:51:47.444685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.824 [2024-11-19 17:51:47.444700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.084 NEW_FUNC[1/671]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:55.084 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.084 #29 NEW cov: 11626 ft: 11627 corp: 2/19b lim: 30 exec/s: 0 rss: 66Mb L: 18/18 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:55.084 [2024-11-19 17:51:47.765080] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.084 [2024-11-19 17:51:47.765176] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.084 [2024-11-19 17:51:47.765235] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.084 [2024-11-19 17:51:47.765291] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.084 [2024-11-19 17:51:47.765347] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.084 [2024-11-19 17:51:47.765455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.765481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.765512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.765528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.765556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 
cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.765572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.765606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.765622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.765651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.765666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.084 #33 NEW cov: 11739 ft: 12754 corp: 3/49b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 4 ShuffleBytes-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:07:55.084 [2024-11-19 17:51:47.825155] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.084 [2024-11-19 17:51:47.825230] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:55.084 [2024-11-19 17:51:47.825292] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.084 [2024-11-19 17:51:47.825349] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.084 [2024-11-19 17:51:47.825460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.825481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.825511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.825526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.825554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.825570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.084 [2024-11-19 17:51:47.825607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.825624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.084 #34 NEW cov: 11745 ft: 12941 corp: 4/74b lim: 30 exec/s: 0 rss: 67Mb L: 25/30 MS: 1 InsertRepeatedBytes- 00:07:55.084 [2024-11-19 17:51:47.895273] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (257040) > buf size (4096) 00:07:55.084 [2024-11-19 17:51:47.895408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fb030000 cdw11:00000000 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:55.084 [2024-11-19 17:51:47.895431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.084 #36 NEW cov: 11853 ft: 13632 corp: 5/83b lim: 30 exec/s: 0 rss: 67Mb L: 9/30 MS: 2 ShuffleBytes-CMP- DE: "\373\003\000\000\000\000\000\000"- 00:07:55.344 [2024-11-19 17:51:47.955526] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:47.955609] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:47.955668] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:47.955724] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:47.955780] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:47.955907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:47.955927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.344 [2024-11-19 17:51:47.955958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:47.955974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.344 [2024-11-19 17:51:47.956002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:47.956017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.344 [2024-11-19 17:51:47.956045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:47.956064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.344 [2024-11-19 17:51:47.956091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:47.956107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.344 #37 NEW cov: 11853 ft: 13782 corp: 6/113b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 CopyPart- 00:07:55.344 [2024-11-19 17:51:48.025684] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:48.025758] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:48.025816] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:48.025871] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.344 [2024-11-19 17:51:48.025927] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 
00:07:55.344 [2024-11-19 17:51:48.026056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:48.026076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.344 [2024-11-19 17:51:48.026107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.344 [2024-11-19 17:51:48.026123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.026152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.026167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.026196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.026211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.026240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.026255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.345 #38 NEW cov: 11853 ft: 13856 corp: 7/143b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:55.345 [2024-11-19 17:51:48.095842] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.095930] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000fffd 00:07:55.345 [2024-11-19 17:51:48.095988] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.096044] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.096099] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.096209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.096229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.096259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.096278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.096307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 
[2024-11-19 17:51:48.096323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.096350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.096365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.096392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.096406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.345 #39 NEW cov: 11853 ft: 14022 corp: 8/173b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 ChangeBit- 00:07:55.345 [2024-11-19 17:51:48.166064] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.166138] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.166197] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.166252] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.345 [2024-11-19 17:51:48.166308] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffef 00:07:55.345 [2024-11-19 17:51:48.166412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.166432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.166462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.166477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.166505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.166520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.166547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.166562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.345 [2024-11-19 17:51:48.166589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.345 [2024-11-19 17:51:48.166611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.345 #40 NEW cov: 11853 ft: 14147 corp: 9/203b lim: 30 exec/s: 0 
rss: 67Mb L: 30/30 MS: 1 ChangeBit- 00:07:55.605 [2024-11-19 17:51:48.216114] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x85 00:07:55.605 [2024-11-19 17:51:48.216248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.605 [2024-11-19 17:51:48.216270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.605 #44 NEW cov: 11853 ft: 14309 corp: 10/209b lim: 30 exec/s: 0 rss: 67Mb L: 6/30 MS: 4 ShuffleBytes-ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:55.605 [2024-11-19 17:51:48.266310] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.605 [2024-11-19 17:51:48.266400] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.605 [2024-11-19 17:51:48.266461] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.605 [2024-11-19 17:51:48.266519] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.266576] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7ef 00:07:55.606 [2024-11-19 17:51:48.266697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.266720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.266751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.266768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.266796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.266812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.266841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.266857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.266885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.266901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.606 #45 NEW cov: 11853 ft: 14332 corp: 11/239b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 ChangeBit- 00:07:55.606 [2024-11-19 17:51:48.336531] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.336614] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 
17:51:48.336676] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.336734] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.336854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.336875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.336907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.336923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.336951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.336967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.336995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.337015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.606 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.606 #46 NEW cov: 11870 ft: 14365 corp: 12/267b lim: 30 exec/s: 0 rss: 68Mb L: 28/30 MS: 1 EraseBytes- 00:07:55.606 [2024-11-19 17:51:48.386609] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.386700] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffef 00:07:55.606 [2024-11-19 17:51:48.386758] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.386814] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.386870] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.606 [2024-11-19 17:51:48.386977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.386997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.387027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.387042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.387070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 
17:51:48.387085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.387112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.387127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.387154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.387169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.606 #47 NEW cov: 11870 ft: 14419 corp: 13/297b lim: 30 exec/s: 47 rss: 69Mb L: 30/30 MS: 1 CrossOver- 00:07:55.606 [2024-11-19 17:51:48.456824] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.606 [2024-11-19 17:51:48.456912] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:55.606 [2024-11-19 17:51:48.456971] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.606 [2024-11-19 17:51:48.457026] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.606 [2024-11-19 17:51:48.457135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.457155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.457185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.457201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.457228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.457247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.606 [2024-11-19 17:51:48.457275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.606 [2024-11-19 17:51:48.457290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.866 #48 NEW cov: 11870 ft: 14437 corp: 14/323b lim: 30 exec/s: 48 rss: 69Mb L: 26/30 MS: 1 InsertByte- 00:07:55.866 [2024-11-19 17:51:48.526975] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.866 [2024-11-19 17:51:48.527061] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:55.867 [2024-11-19 17:51:48.527119] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.867 [2024-11-19 17:51:48.527175] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x30000f7f7 00:07:55.867 [2024-11-19 17:51:48.527278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.527298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.527328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.527343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.527370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.527386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.527413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.527428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.867 #49 NEW cov: 11870 ft: 14482 corp: 15/349b lim: 30 exec/s: 49 rss: 69Mb L: 26/30 MS: 1 ChangeBit- 00:07:55.867 [2024-11-19 17:51:48.597193] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.597269] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.597328] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.597386] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.597497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.597517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.597549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.597564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.597592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.597616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.597649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.597664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.647289] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.647374] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:55.867 [2024-11-19 17:51:48.647432] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.647487] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.647591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.647620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.647651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.647666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.647694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:f7ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.647709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.647736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.647751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.867 #51 NEW cov: 11870 ft: 14499 corp: 16/374b lim: 30 exec/s: 51 rss: 69Mb L: 25/30 MS: 2 EraseBytes-CrossOver- 00:07:55.867 [2024-11-19 17:51:48.697435] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.697521] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.697579] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.697643] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.867 [2024-11-19 17:51:48.697700] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7ef 00:07:55.867 [2024-11-19 17:51:48.697809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.697829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.697859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.697874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.867 
[2024-11-19 17:51:48.697902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffdf83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.697916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.697943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.697962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.867 [2024-11-19 17:51:48.697989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.867 [2024-11-19 17:51:48.698004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.127 #52 NEW cov: 11870 ft: 14514 corp: 17/404b lim: 30 exec/s: 52 rss: 69Mb L: 30/30 MS: 1 ChangeBit- 00:07:56.127 [2024-11-19 17:51:48.767648] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.127 [2024-11-19 17:51:48.767721] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:56.127 [2024-11-19 17:51:48.767779] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.127 [2024-11-19 17:51:48.767834] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.127 [2024-11-19 17:51:48.767945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.127 [2024-11-19 17:51:48.767965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.127 [2024-11-19 17:51:48.767995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.127 [2024-11-19 17:51:48.768010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.127 [2024-11-19 17:51:48.768037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.127 [2024-11-19 17:51:48.768052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.127 [2024-11-19 17:51:48.768079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.127 [2024-11-19 17:51:48.768094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.127 #53 NEW cov: 11870 ft: 14596 corp: 18/430b lim: 30 exec/s: 53 rss: 69Mb L: 26/30 MS: 1 CrossOver- 00:07:56.127 [2024-11-19 17:51:48.818353] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.818469] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x200001e1e 00:07:56.128 [2024-11-19 17:51:48.818575] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f72b 00:07:56.128 [2024-11-19 17:51:48.818686] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.818903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.818932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.818986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.819000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.819052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.819066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.819117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.819134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.128 #54 NEW cov: 11870 ft: 14678 corp: 19/457b lim: 30 exec/s: 54 rss: 69Mb L: 27/30 MS: 1 InsertByte- 00:07:56.128 [2024-11-19 17:51:48.858472] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.858583] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.858713] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.858818] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.858923] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.859172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:011e83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.859198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.859252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.859267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.859323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.859336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.859390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.859404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.859459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.859473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.128 #55 NEW cov: 11870 ft: 14700 corp: 20/487b lim: 30 exec/s: 55 rss: 69Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:56.128 [2024-11-19 17:51:48.898646] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.898756] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.898860] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.898961] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.128 [2024-11-19 17:51:48.899063] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7ef 00:07:56.128 [2024-11-19 17:51:48.899283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.899309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.899363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.899377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.899432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffdf83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.899449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.899503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.899516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.899569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.899583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.128 #56 NEW cov: 11870 ft: 14738 corp: 21/517b lim: 30 exec/s: 56 rss: 69Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:56.128 [2024-11-19 17:51:48.938703] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.938814] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:56.128 [2024-11-19 17:51:48.938918] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.939021] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.939237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.939263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.939318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.939332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.939386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.939400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.939453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f72b83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.939467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.128 #57 NEW cov: 11870 ft: 14744 corp: 22/544b lim: 30 exec/s: 57 rss: 69Mb L: 27/30 MS: 1 ShuffleBytes- 00:07:56.128 [2024-11-19 17:51:48.978827] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.978938] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:56.128 [2024-11-19 17:51:48.979043] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.979146] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.128 [2024-11-19 17:51:48.979347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.979372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.979427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.979441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.979516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e83f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.979531] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.128 [2024-11-19 17:51:48.979587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.128 [2024-11-19 17:51:48.979605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.391 #58 NEW cov: 11870 ft: 14758 corp: 23/570b lim: 30 exec/s: 58 rss: 69Mb L: 26/30 MS: 1 ShuffleBytes- 00:07:56.391 [2024-11-19 17:51:49.018947] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.019060] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.019167] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.019273] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.019481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.019506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.019562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.019575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.019638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.019651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.019702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.019715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.391 #59 NEW cov: 11870 ft: 14770 corp: 24/596b lim: 30 exec/s: 59 rss: 69Mb L: 26/30 MS: 1 EraseBytes- 00:07:56.391 [2024-11-19 17:51:49.059086] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.391 [2024-11-19 17:51:49.059211] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1040352) > buf size (4096) 00:07:56.391 [2024-11-19 17:51:49.059408] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xf7f7 00:07:56.391 [2024-11-19 17:51:49.059514] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.391 [2024-11-19 17:51:49.059725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.059751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.059807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.059821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.059875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.059893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.059948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.059962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.060015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:f7f783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.060029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.391 #60 NEW cov: 11887 ft: 14822 corp: 25/626b lim: 30 exec/s: 60 rss: 69Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:56.391 [2024-11-19 17:51:49.099207] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.099318] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.099423] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.099525] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.099636] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.099838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff831e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.099862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.099916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.099929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.099981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.099995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.100049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.100062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.100115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.100128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.391 #61 NEW cov: 11887 ft: 14836 corp: 26/656b lim: 30 exec/s: 61 rss: 69Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:56.391 [2024-11-19 17:51:49.139291] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.139404] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000fffb 00:07:56.391 [2024-11-19 17:51:49.139520] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.139628] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.391 [2024-11-19 17:51:49.139831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.139856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.391 [2024-11-19 17:51:49.139915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.391 [2024-11-19 17:51:49.139930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.139982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.139996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.140030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.140043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.392 #62 NEW cov: 11887 ft: 14853 corp: 27/684b lim: 30 exec/s: 62 rss: 69Mb L: 28/30 MS: 1 ChangeBit- 00:07:56.392 [2024-11-19 17:51:49.179398] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.392 [2024-11-19 17:51:49.179525] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e1e 00:07:56.392 [2024-11-19 17:51:49.179636] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.392 [2024-11-19 17:51:49.179751] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f7f7 00:07:56.392 [2024-11-19 17:51:49.179967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0af783f7 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.179993] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.180050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f71e022f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.180063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.180118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1e1e831e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.180131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.180179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f7f7832b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.180192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.392 #63 NEW cov: 11887 ft: 14889 corp: 28/712b lim: 30 exec/s: 63 rss: 69Mb L: 28/30 MS: 1 InsertByte- 00:07:56.392 [2024-11-19 17:51:49.219547] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.392 [2024-11-19 17:51:49.219679] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.392 [2024-11-19 17:51:49.219781] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.392 [2024-11-19 17:51:49.219878] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:56.392 [2024-11-19 17:51:49.219980] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.392 [2024-11-19 17:51:49.220190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:011e83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.220215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.220270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.220287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.220340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.220353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.220405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.220419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.392 [2024-11-19 17:51:49.220472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.392 [2024-11-19 17:51:49.220486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.392 #64 NEW cov: 11887 ft: 14899 corp: 29/742b lim: 30 exec/s: 64 rss: 69Mb L: 30/30 MS: 1 ChangeByte- 00:07:56.654 [2024-11-19 17:51:49.259644] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.259754] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.259860] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.259959] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.260153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.260179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.260233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.260247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.260300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.260314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.260370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.260383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.654 #65 NEW cov: 11887 ft: 14907 corp: 30/768b lim: 30 exec/s: 65 rss: 69Mb L: 26/30 MS: 1 EraseBytes- 00:07:56.654 [2024-11-19 17:51:49.299767] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.299875] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.299982] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.300087] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:56.654 [2024-11-19 17:51:49.300192] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.654 [2024-11-19 17:51:49.300395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:011e83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.300424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.300477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.300491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.300535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.300548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.300603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.300617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.654 [2024-11-19 17:51:49.300670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.654 [2024-11-19 17:51:49.300683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.655 #66 NEW cov: 11887 ft: 14935 corp: 31/798b lim: 30 exec/s: 66 rss: 69Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:56.655 [2024-11-19 17:51:49.339897] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.340008] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000fffd 00:07:56.655 [2024-11-19 17:51:49.340107] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.340202] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:56.655 [2024-11-19 17:51:49.340304] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:56.655 [2024-11-19 17:51:49.340499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.340524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.340577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.340590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.340650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.340664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.340716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.340728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.340781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.340794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.655 #67 NEW cov: 11894 ft: 14953 corp: 32/828b lim: 30 exec/s: 67 rss: 69Mb L: 30/30 MS: 1 PersAutoDict- DE: "\373\003\000\000\000\000\000\000"- 00:07:56.655 [2024-11-19 17:51:49.379985] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.380099] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.380201] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.380301] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.380509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.380534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.380590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:fff583ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.380608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.380662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.380675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.380728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.380742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.655 #68 NEW cov: 11894 ft: 14963 corp: 33/856b lim: 30 exec/s: 68 rss: 69Mb L: 28/30 MS: 1 ChangeBinInt- 00:07:56.655 [2024-11-19 17:51:49.420149] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.420257] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.420374] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000028ff 00:07:56.655 [2024-11-19 17:51:49.420471] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.420567] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.655 [2024-11-19 17:51:49.420777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 
17:51:49.420803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.420859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.420874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.420929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.420942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.420997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.421011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.655 [2024-11-19 17:51:49.421064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.655 [2024-11-19 17:51:49.421077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.655 #69 NEW cov: 11894 ft: 14975 corp: 34/886b lim: 30 exec/s: 34 rss: 70Mb L: 30/30 MS: 1 CopyPart- 00:07:56.655 #69 DONE cov: 11894 ft: 14975 corp: 34/886b lim: 30 exec/s: 34 rss: 70Mb 00:07:56.655 ###### Recommended dictionary. ###### 00:07:56.655 "\373\003\000\000\000\000\000\000" # Uses: 1 00:07:56.655 ###### End of recommended dictionary. 
###### 00:07:56.655 Done 69 runs in 2 second(s) 00:07:56.914 17:51:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:56.914 17:51:49 -- ../common.sh@72 -- # (( i++ )) 00:07:56.914 17:51:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.914 17:51:49 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:56.914 17:51:49 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:56.914 17:51:49 -- nvmf/run.sh@24 -- # local timen=1 00:07:56.914 17:51:49 -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.914 17:51:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.914 17:51:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:56.914 17:51:49 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:56.914 17:51:49 -- nvmf/run.sh@29 -- # port=4402 00:07:56.914 17:51:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.914 17:51:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:56.914 17:51:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.914 17:51:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:56.914 [2024-11-19 17:51:49.604031] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:56.914 [2024-11-19 17:51:49.604118] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid634696 ] 00:07:56.914 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.173 [2024-11-19 17:51:49.858181] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.173 [2024-11-19 17:51:49.885512] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.173 [2024-11-19 17:51:49.885643] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.173 [2024-11-19 17:51:49.937082] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.173 [2024-11-19 17:51:49.953397] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:57.173 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.173 INFO: Seed: 2083310994 00:07:57.173 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:57.173 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:57.173 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:57.173 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.173 #2 INITED exec/s: 0 rss: 59Mb 00:07:57.173 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
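The nvmf/run.sh trace just above is the heart of every run in this job: pick a fuzzer type, derive a TCP port as 44 plus the zero-padded type (the printf %02d step), rewrite the trsvcid in the fuzz_json.conf template with sed, and launch llvm_nvme_fuzz against the resulting target. Below is a minimal bash sketch of that launcher, reconstructed only from the traced commands; the $rootdir variable, the redirection of sed's output into the per-run config, and the exact point of the cleanup are assumptions the trace does not show:

# Sketch of the start_llvm_fuzz step traced above (nvmf/run.sh@23-46); not the canonical script.
start_llvm_fuzz() {
    local fuzzer_type=$1   # 2, 3, 4 ... in the runs in this log
    local timen=$2         # -t: seconds per run (1 here)
    local core=$3          # -m: SPDK core mask (0x1 here)
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type   # $rootdir assumed: the spdk checkout
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local port="44$(printf %02d $fuzzer_type)"                   # 4402, 4403, 4404 ...

    mkdir -p $corpus_dir
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # The template config listens on 4420; point it at this run's port (redirection assumed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg

    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
        -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
        -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk$fuzzer_type.sock

    rm -rf $nvmf_cfg   # matches the rm -rf /tmp/fuzz_json_N.conf seen between runs
}

Each run in this log corresponds to a call of the form start_llvm_fuzz <type> 1 0x1, which lines up with the -t 1 and -Z <type> pairs in the invocations traced above.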
00:07:57.173 This may also happen if the target rejected all inputs we tried so far 00:07:57.692 NEW_FUNC[1/656]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:57.692 NEW_FUNC[2/656]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.692 #11 NEW cov: 11452 ft: 11453 corp: 2/10b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 4 ChangeBit-CopyPart-ShuffleBytes-CMP- DE: "\001\214lQm^v,"- 00:07:57.692 #12 NEW cov: 11565 ft: 12178 corp: 3/19b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:57.692 [2024-11-19 17:51:50.389855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:018c002c cdw11:6d006c51 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.692 [2024-11-19 17:51:50.389896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.692 NEW_FUNC[1/14]: 0x16b5478 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:07:57.692 NEW_FUNC[2/14]: 0x16b56b8 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:07:57.692 #13 NEW cov: 11703 ft: 12806 corp: 4/36b lim: 35 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 PersAutoDict- DE: "\001\214lQm^v,"- 00:07:57.692 [2024-11-19 17:51:50.429745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.692 [2024-11-19 17:51:50.429772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.692 #14 NEW cov: 11788 ft: 13447 corp: 5/45b lim: 35 exec/s: 0 rss: 67Mb L: 9/17 MS: 1 ChangeBinInt- 00:07:57.692 [2024-11-19 17:51:50.479909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:002a0009 cdw11:6d000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.692 [2024-11-19 17:51:50.479934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.692 #15 NEW cov: 11788 ft: 13494 corp: 6/55b lim: 35 exec/s: 0 rss: 67Mb L: 10/17 MS: 1 InsertByte- 00:07:57.692 [2024-11-19 17:51:50.520158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:018c002c cdw11:6d006c51 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.692 [2024-11-19 17:51:50.520183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.692 #16 NEW cov: 11788 ft: 13532 corp: 7/72b lim: 35 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 ChangeByte- 00:07:57.952 #17 NEW cov: 11788 ft: 13634 corp: 8/82b lim: 35 exec/s: 0 rss: 67Mb L: 10/17 MS: 1 InsertByte- 00:07:57.952 [2024-11-19 17:51:50.600226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:006d0009 cdw11:2c005e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.952 [2024-11-19 17:51:50.600250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.952 #18 NEW cov: 11788 ft: 13764 corp: 9/89b lim: 35 exec/s: 0 rss: 67Mb L: 7/17 MS: 1 EraseBytes- 00:07:57.952 #19 NEW cov: 11788 ft: 13857 corp: 10/98b lim: 35 exec/s: 
0 rss: 67Mb L: 9/17 MS: 1 ChangeBit- 00:07:57.952 [2024-11-19 17:51:50.680531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.952 [2024-11-19 17:51:50.680556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.952 #20 NEW cov: 11788 ft: 13877 corp: 11/107b lim: 35 exec/s: 0 rss: 67Mb L: 9/17 MS: 1 ShuffleBytes- 00:07:57.952 [2024-11-19 17:51:50.720915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.952 [2024-11-19 17:51:50.720940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.952 [2024-11-19 17:51:50.721008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:486d006c cdw11:2c005e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.952 [2024-11-19 17:51:50.721022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.952 #21 NEW cov: 11788 ft: 14119 corp: 12/128b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:07:57.952 [2024-11-19 17:51:50.760704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00276d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.952 [2024-11-19 17:51:50.760729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.952 #22 NEW cov: 11788 ft: 14181 corp: 13/137b lim: 35 exec/s: 0 rss: 67Mb L: 9/21 MS: 1 ChangeByte- 00:07:57.952 [2024-11-19 17:51:50.800825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00340009 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.952 [2024-11-19 17:51:50.800849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.212 #23 NEW cov: 11788 ft: 14202 corp: 14/146b lim: 35 exec/s: 0 rss: 67Mb L: 9/21 MS: 1 ChangeByte- 00:07:58.212 [2024-11-19 17:51:50.840933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000011 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:50.840957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.212 #24 NEW cov: 11788 ft: 14302 corp: 15/155b lim: 35 exec/s: 0 rss: 67Mb L: 9/21 MS: 1 ChangeBinInt- 00:07:58.212 [2024-11-19 17:51:50.881049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8c6c0081 cdw11:5e00486d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:50.881073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.212 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.212 #25 NEW cov: 11811 ft: 14449 corp: 16/164b lim: 35 exec/s: 0 rss: 67Mb L: 9/21 MS: 1 ChangeBit- 00:07:58.212 #26 NEW cov: 11811 ft: 14806 corp: 17/181b lim: 35 exec/s: 0 rss: 67Mb L: 17/21 MS: 1 PersAutoDict- DE: "\001\214lQm^v,"- 00:07:58.212 
[2024-11-19 17:51:50.961271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a000009 cdw11:6d000027 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:50.961295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.212 #27 NEW cov: 11811 ft: 14839 corp: 18/191b lim: 35 exec/s: 0 rss: 68Mb L: 10/21 MS: 1 CrossOver- 00:07:58.212 [2024-11-19 17:51:51.001806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5c5c005c cdw11:5c005c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:51.001830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.212 [2024-11-19 17:51:51.001883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:5c5c005c cdw11:5c005c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:51.001896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.212 [2024-11-19 17:51:51.001948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:5c5c005c cdw11:5c005c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:51.001961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.212 [2024-11-19 17:51:51.002013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:5c5c005c cdw11:5c005c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.212 [2024-11-19 17:51:51.002026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.212 #30 NEW cov: 11811 ft: 15340 corp: 19/224b lim: 35 exec/s: 30 rss: 68Mb L: 33/33 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:58.212 #31 NEW cov: 11811 ft: 15350 corp: 20/233b lim: 35 exec/s: 31 rss: 68Mb L: 9/33 MS: 1 PersAutoDict- DE: "\001\214lQm^v,"- 00:07:58.471 #32 NEW cov: 11811 ft: 15353 corp: 21/244b lim: 35 exec/s: 32 rss: 68Mb L: 11/33 MS: 1 CMP- DE: "\001\022"- 00:07:58.471 [2024-11-19 17:51:51.111744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.471 [2024-11-19 17:51:51.111772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.471 #33 NEW cov: 11811 ft: 15410 corp: 22/253b lim: 35 exec/s: 33 rss: 68Mb L: 9/33 MS: 1 ChangeBinInt- 00:07:58.471 [2024-11-19 17:51:51.151828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a150009 cdw11:6d000027 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.471 [2024-11-19 17:51:51.151852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.471 #34 NEW cov: 11811 ft: 15421 corp: 23/263b lim: 35 exec/s: 34 rss: 68Mb L: 10/33 MS: 1 ChangeByte- 00:07:58.471 [2024-11-19 17:51:51.191974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a000009 cdw11:6d0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.471 [2024-11-19 17:51:51.191999] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.471 #35 NEW cov: 11811 ft: 15485 corp: 24/273b lim: 35 exec/s: 35 rss: 68Mb L: 10/33 MS: 1 ChangeByte- 00:07:58.471 #36 NEW cov: 11811 ft: 15556 corp: 25/289b lim: 35 exec/s: 36 rss: 68Mb L: 16/33 MS: 1 CopyPart- 00:07:58.471 [2024-11-19 17:51:51.272204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:006d0009 cdw11:2c005e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.471 [2024-11-19 17:51:51.272229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.471 #37 NEW cov: 11811 ft: 15675 corp: 26/299b lim: 35 exec/s: 37 rss: 68Mb L: 10/33 MS: 1 CopyPart- 00:07:58.471 [2024-11-19 17:51:51.312504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.471 [2024-11-19 17:51:51.312529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.731 #38 NEW cov: 11811 ft: 15680 corp: 27/319b lim: 35 exec/s: 38 rss: 68Mb L: 20/33 MS: 1 EraseBytes- 00:07:58.731 [2024-11-19 17:51:51.352405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.352429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.731 #39 NEW cov: 11811 ft: 15704 corp: 28/328b lim: 35 exec/s: 39 rss: 68Mb L: 9/33 MS: 1 ChangeByte- 00:07:58.731 [2024-11-19 17:51:51.392745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00bfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.392771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.731 #40 NEW cov: 11811 ft: 15715 corp: 29/348b lim: 35 exec/s: 40 rss: 68Mb L: 20/33 MS: 1 ChangeBit- 00:07:58.731 [2024-11-19 17:51:51.432681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00276d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.432707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.731 [2024-11-19 17:51:51.472786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00090009 cdw11:27000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.472811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.731 #42 NEW cov: 11811 ft: 15725 corp: 30/360b lim: 35 exec/s: 42 rss: 68Mb L: 12/33 MS: 2 EraseBytes-CopyPart- 00:07:58.731 [2024-11-19 17:51:51.512930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0009 cdw11:6d0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.512956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.731 #43 NEW cov: 11811 ft: 15737 corp: 31/370b lim: 35 exec/s: 43 rss: 68Mb L: 
10/33 MS: 1 ShuffleBytes- 00:07:58.731 [2024-11-19 17:51:51.553025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:006d0009 cdw11:cc005e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.553049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.731 #44 NEW cov: 11811 ft: 15813 corp: 32/377b lim: 35 exec/s: 44 rss: 68Mb L: 7/33 MS: 1 ChangeBinInt- 00:07:58.731 [2024-11-19 17:51:51.593166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:76ff0009 cdw11:00006d00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.731 [2024-11-19 17:51:51.593191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 #45 NEW cov: 11811 ft: 15833 corp: 33/387b lim: 35 exec/s: 45 rss: 68Mb L: 10/33 MS: 1 ShuffleBytes- 00:07:58.991 [2024-11-19 17:51:51.633242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2f000009 cdw11:00000900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.633267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 #46 NEW cov: 11811 ft: 15835 corp: 34/400b lim: 35 exec/s: 46 rss: 69Mb L: 13/33 MS: 1 InsertByte- 00:07:58.991 [2024-11-19 17:51:51.673387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.673412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 #47 NEW cov: 11811 ft: 15839 corp: 35/409b lim: 35 exec/s: 47 rss: 69Mb L: 9/33 MS: 1 ShuffleBytes- 00:07:58.991 [2024-11-19 17:51:51.703578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8c6c0081 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.703608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 [2024-11-19 17:51:51.703688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.703702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.991 #48 NEW cov: 11811 ft: 15861 corp: 36/428b lim: 35 exec/s: 48 rss: 69Mb L: 19/33 MS: 1 InsertRepeatedBytes- 00:07:58.991 [2024-11-19 17:51:51.743579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28090009 cdw11:27000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.743608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 #49 NEW cov: 11811 ft: 15876 corp: 37/440b lim: 35 exec/s: 49 rss: 69Mb L: 12/33 MS: 1 ChangeByte- 00:07:58.991 [2024-11-19 17:51:51.783709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.783734] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 #50 NEW cov: 11811 ft: 15953 corp: 38/453b lim: 35 exec/s: 50 rss: 69Mb L: 13/33 MS: 1 CMP- DE: "\235\001\000\000"- 00:07:58.991 [2024-11-19 17:51:51.823799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:5e00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.991 [2024-11-19 17:51:51.823824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.991 #51 NEW cov: 11811 ft: 15962 corp: 39/462b lim: 35 exec/s: 51 rss: 69Mb L: 9/33 MS: 1 ChangeBit- 00:07:59.250 [2024-11-19 17:51:51.864356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.250 [2024-11-19 17:51:51.864381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.250 [2024-11-19 17:51:51.864431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.250 [2024-11-19 17:51:51.864444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.250 [2024-11-19 17:51:51.864495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.250 [2024-11-19 17:51:51.864507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.250 [2024-11-19 17:51:51.864560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0000ff cdw11:76006d5e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.250 [2024-11-19 17:51:51.864573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.250 #52 NEW cov: 11811 ft: 15987 corp: 40/491b lim: 35 exec/s: 52 rss: 69Mb L: 29/33 MS: 1 InsertRepeatedBytes- 00:07:59.250 [2024-11-19 17:51:51.914106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:27000009 cdw11:5e00276d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.250 [2024-11-19 17:51:51.914131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.250 #53 NEW cov: 11811 ft: 16049 corp: 41/499b lim: 35 exec/s: 53 rss: 69Mb L: 8/33 MS: 1 EraseBytes- 00:07:59.250 #54 NEW cov: 11811 ft: 16056 corp: 42/516b lim: 35 exec/s: 54 rss: 69Mb L: 17/33 MS: 1 ShuffleBytes- 00:07:59.251 [2024-11-19 17:51:51.994322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:6d000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.251 [2024-11-19 17:51:51.994346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.251 #55 NEW cov: 11811 ft: 16126 corp: 43/526b lim: 35 exec/s: 27 rss: 69Mb L: 10/33 MS: 1 CrossOver- 00:07:59.251 #55 DONE cov: 11811 ft: 16126 corp: 43/526b lim: 35 exec/s: 27 rss: 69Mb 00:07:59.251 ###### Recommended dictionary. 
###### 00:07:59.251 "\001\214lQm^v," # Uses: 3 00:07:59.251 "\001\022" # Uses: 0 00:07:59.251 "\235\001\000\000" # Uses: 0 00:07:59.251 ###### End of recommended dictionary. ###### 00:07:59.251 Done 55 runs in 2 second(s) 00:07:59.510 17:51:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:59.510 17:51:52 -- ../common.sh@72 -- # (( i++ )) 00:07:59.510 17:51:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.510 17:51:52 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:59.510 17:51:52 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:59.510 17:51:52 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.510 17:51:52 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.510 17:51:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:59.510 17:51:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:59.510 17:51:52 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:59.510 17:51:52 -- nvmf/run.sh@29 -- # port=4403 00:07:59.510 17:51:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:59.510 17:51:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:59.510 17:51:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.510 17:51:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:59.510 [2024-11-19 17:51:52.179898] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:59.510 [2024-11-19 17:51:52.179992] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid635233 ] 00:07:59.510 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.770 [2024-11-19 17:51:52.436730] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.770 [2024-11-19 17:51:52.462265] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:59.770 [2024-11-19 17:51:52.462384] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.770 [2024-11-19 17:51:52.513797] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.770 [2024-11-19 17:51:52.530117] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:59.770 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.770 INFO: Seed: 366343141 00:07:59.770 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:59.770 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:59.770 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:59.770 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.770 #2 INITED exec/s: 0 rss: 59Mb 00:07:59.770 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:07:59.770 This may also happen if the target rejected all inputs we tried so far 00:08:00.030 NEW_FUNC[1/659]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:00.030 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.030 #16 NEW cov: 11480 ft: 11463 corp: 2/5b lim: 20 exec/s: 0 rss: 66Mb L: 4/4 MS: 4 CrossOver-InsertByte-ChangeBit-InsertByte- 00:08:00.290 [2024-11-19 17:51:52.895526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.290 [2024-11-19 17:51:52.895569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.290 NEW_FUNC[1/17]: 0x1137598 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:08:00.290 NEW_FUNC[2/17]: 0x1138118 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:08:00.290 #17 NEW cov: 11846 ft: 12314 corp: 3/12b lim: 20 exec/s: 0 rss: 67Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:08:00.290 #18 NEW cov: 11869 ft: 12896 corp: 4/31b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:08:00.290 #20 NEW cov: 11954 ft: 13221 corp: 5/35b lim: 20 exec/s: 0 rss: 67Mb L: 4/19 MS: 2 CopyPart-CopyPart- 00:08:00.290 [2024-11-19 17:51:53.075873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.290 [2024-11-19 17:51:53.075907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.290 #21 NEW cov: 11954 ft: 13309 corp: 6/42b lim: 20 exec/s: 0 rss: 67Mb L: 7/19 MS: 1 ChangeByte- 00:08:00.550 #22 NEW cov: 11954 ft: 13385 corp: 7/46b lim: 20 exec/s: 0 rss: 67Mb L: 4/19 MS: 1 ChangeBit- 00:08:00.550 #23 NEW cov: 11954 ft: 13551 corp: 8/50b lim: 20 exec/s: 0 rss: 67Mb L: 4/19 MS: 1 ShuffleBytes- 00:08:00.550 [2024-11-19 17:51:53.266402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.550 [2024-11-19 17:51:53.266434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.550 #24 NEW cov: 11954 ft: 13634 corp: 9/57b lim: 20 exec/s: 0 rss: 67Mb L: 7/19 MS: 1 ShuffleBytes- 00:08:00.550 [2024-11-19 17:51:53.316485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.550 [2024-11-19 17:51:53.316520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.550 #25 NEW cov: 11954 ft: 13693 corp: 10/64b lim: 20 exec/s: 0 rss: 67Mb L: 7/19 MS: 1 ChangeBit- 00:08:00.550 #26 NEW cov: 11954 ft: 13728 corp: 11/69b lim: 20 exec/s: 0 rss: 67Mb L: 5/19 MS: 1 InsertByte- 00:08:00.810 #29 NEW cov: 11959 ft: 13955 corp: 12/78b lim: 20 exec/s: 0 rss: 68Mb L: 9/19 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:00.810 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.810 #30 NEW cov: 11976 ft: 
13993 corp: 13/82b lim: 20 exec/s: 0 rss: 68Mb L: 4/19 MS: 1 ShuffleBytes- 00:08:00.810 #31 NEW cov: 11976 ft: 14091 corp: 14/100b lim: 20 exec/s: 31 rss: 68Mb L: 18/19 MS: 1 CopyPart- 00:08:00.810 [2024-11-19 17:51:53.587438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.810 [2024-11-19 17:51:53.587471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.810 NEW_FUNC[1/3]: 0x1292e78 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:773 00:08:00.810 NEW_FUNC[2/3]: 0x12b3f38 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3493 00:08:00.810 #32 NEW cov: 12057 ft: 14309 corp: 15/117b lim: 20 exec/s: 32 rss: 68Mb L: 17/19 MS: 1 InsertRepeatedBytes- 00:08:01.070 #33 NEW cov: 12057 ft: 14343 corp: 16/136b lim: 20 exec/s: 33 rss: 68Mb L: 19/19 MS: 1 ChangeBinInt- 00:08:01.070 #34 NEW cov: 12057 ft: 14356 corp: 17/141b lim: 20 exec/s: 34 rss: 69Mb L: 5/19 MS: 1 InsertByte- 00:08:01.070 #35 NEW cov: 12060 ft: 14485 corp: 18/145b lim: 20 exec/s: 35 rss: 69Mb L: 4/19 MS: 1 EraseBytes- 00:08:01.070 #41 NEW cov: 12060 ft: 14569 corp: 19/150b lim: 20 exec/s: 41 rss: 69Mb L: 5/19 MS: 1 CrossOver- 00:08:01.070 #42 NEW cov: 12060 ft: 14701 corp: 20/170b lim: 20 exec/s: 42 rss: 69Mb L: 20/20 MS: 1 InsertByte- 00:08:01.070 #44 NEW cov: 12060 ft: 14733 corp: 21/179b lim: 20 exec/s: 44 rss: 69Mb L: 9/20 MS: 2 EraseBytes-CrossOver- 00:08:01.329 #45 NEW cov: 12060 ft: 14742 corp: 22/183b lim: 20 exec/s: 45 rss: 69Mb L: 4/20 MS: 1 ShuffleBytes- 00:08:01.329 #46 NEW cov: 12060 ft: 14768 corp: 23/190b lim: 20 exec/s: 46 rss: 69Mb L: 7/20 MS: 1 ShuffleBytes- 00:08:01.329 #47 NEW cov: 12060 ft: 14795 corp: 24/209b lim: 20 exec/s: 47 rss: 69Mb L: 19/20 MS: 1 CrossOver- 00:08:01.329 #48 NEW cov: 12060 ft: 14815 corp: 25/218b lim: 20 exec/s: 48 rss: 69Mb L: 9/20 MS: 1 InsertRepeatedBytes- 00:08:01.329 #49 NEW cov: 12060 ft: 14843 corp: 26/237b lim: 20 exec/s: 49 rss: 69Mb L: 19/20 MS: 1 ChangeByte- 00:08:01.329 #50 NEW cov: 12060 ft: 14910 corp: 27/255b lim: 20 exec/s: 50 rss: 69Mb L: 18/20 MS: 1 ChangeByte- 00:08:01.589 #51 NEW cov: 12060 ft: 14961 corp: 28/260b lim: 20 exec/s: 51 rss: 69Mb L: 5/20 MS: 1 CMP- DE: "\377\377\001\000"- 00:08:01.589 #52 NEW cov: 12060 ft: 14983 corp: 29/269b lim: 20 exec/s: 52 rss: 69Mb L: 9/20 MS: 1 CMP- DE: "\000\000\000\006"- 00:08:01.589 #53 NEW cov: 12060 ft: 14998 corp: 30/273b lim: 20 exec/s: 53 rss: 69Mb L: 4/20 MS: 1 EraseBytes- 00:08:01.589 #54 NEW cov: 12060 ft: 15016 corp: 31/282b lim: 20 exec/s: 54 rss: 69Mb L: 9/20 MS: 1 ChangeBinInt- 00:08:01.589 #55 NEW cov: 12060 ft: 15063 corp: 32/290b lim: 20 exec/s: 55 rss: 69Mb L: 8/20 MS: 1 PersAutoDict- DE: "\000\000\000\006"- 00:08:01.589 #56 NEW cov: 12060 ft: 15085 corp: 33/299b lim: 20 exec/s: 56 rss: 70Mb L: 9/20 MS: 1 ChangeBit- 00:08:01.589 #57 NEW cov: 12064 ft: 15186 corp: 34/313b lim: 20 exec/s: 57 rss: 70Mb L: 14/20 MS: 1 InsertRepeatedBytes- 00:08:01.849 #58 NEW cov: 12071 ft: 15225 corp: 35/332b lim: 20 exec/s: 58 rss: 70Mb L: 19/20 MS: 1 InsertByte- 00:08:01.849 #59 NEW cov: 12071 ft: 15235 corp: 36/342b lim: 20 exec/s: 59 rss: 70Mb L: 10/20 MS: 1 CrossOver- 00:08:01.849 #60 NEW cov: 12071 ft: 15242 corp: 37/350b lim: 20 exec/s: 30 rss: 70Mb L: 8/20 MS: 1 InsertByte- 00:08:01.849 #60 DONE cov: 12071 ft: 15242 corp: 37/350b lim: 20 
exec/s: 30 rss: 70Mb 00:08:01.849 ###### Recommended dictionary. ###### 00:08:01.849 "\377\377\001\000" # Uses: 0 00:08:01.849 "\000\000\000\006" # Uses: 1 00:08:01.849 ###### End of recommended dictionary. ###### 00:08:01.849 Done 60 runs in 2 second(s) 00:08:01.849 17:51:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:08:01.849 17:51:54 -- ../common.sh@72 -- # (( i++ )) 00:08:01.849 17:51:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.849 17:51:54 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:01.849 17:51:54 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:01.849 17:51:54 -- nvmf/run.sh@24 -- # local timen=1 00:08:01.849 17:51:54 -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.849 17:51:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.849 17:51:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:01.849 17:51:54 -- nvmf/run.sh@29 -- # printf %02d 4 00:08:01.849 17:51:54 -- nvmf/run.sh@29 -- # port=4404 00:08:01.849 17:51:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.849 17:51:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:01.849 17:51:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.849 17:51:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:08:02.109 [2024-11-19 17:51:54.723125] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:02.109 [2024-11-19 17:51:54.723194] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid635636 ] 00:08:02.109 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.370 [2024-11-19 17:51:54.986424] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.370 [2024-11-19 17:51:55.014827] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.370 [2024-11-19 17:51:55.014966] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.370 [2024-11-19 17:51:55.066446] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.370 [2024-11-19 17:51:55.082804] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:02.370 INFO: Running with entropic power schedule (0xFF, 100). 
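A note on the "Recommended dictionary" block printed above: libFuzzer emits the entries it found useful in standard AFL/libFuzzer dictionary syntax, so they can be fed back into a later run via the -dict= option. A minimal bash sketch, assuming the run output above was captured to a log file (the log and dictionary filenames here are illustrative, not part of run.sh):

#!/bin/bash
# Extract the recommended entries printed between the dictionary markers
# and save them in the quoted-token format libFuzzer's -dict= option expects.
log=fuzz_run_3.log                  # hypothetical capture of the output above
dict=/tmp/llvm_nvmf_3.dict
sed -n '/Recommended dictionary/,/End of recommended dictionary/p' "$log" \
  | grep -o '"[^"]*"' > "$dict"
# A later run could then seed mutations with these tokens, e.g.:
#   llvm_nvme_fuzz ... -dict="$dict" ...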
00:08:02.370 INFO: Seed: 2919373073 00:08:02.370 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:02.370 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:02.370 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:02.370 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.370 #2 INITED exec/s: 0 rss: 59Mb 00:08:02.370 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:02.370 This may also happen if the target rejected all inputs we tried so far 00:08:02.370 [2024-11-19 17:51:55.127594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0ae1 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.370 [2024-11-19 17:51:55.127635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.370 [2024-11-19 17:51:55.127669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.370 [2024-11-19 17:51:55.127684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.370 [2024-11-19 17:51:55.127713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.370 [2024-11-19 17:51:55.127728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.370 [2024-11-19 17:51:55.127760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.370 [2024-11-19 17:51:55.127775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.630 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:02.630 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.630 #7 NEW cov: 11605 ft: 11606 corp: 2/32b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 5 InsertByte-CopyPart-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:02.630 [2024-11-19 17:51:55.448184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000a3b12 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.630 [2024-11-19 17:51:55.448223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.630 #17 NEW cov: 11718 ft: 12946 corp: 3/39b lim: 35 exec/s: 0 rss: 67Mb L: 7/31 MS: 5 ShuffleBytes-CMP-InsertByte-ShuffleBytes-CrossOver- DE: "\022\000\000\000"- 00:08:02.890 [2024-11-19 17:51:55.508405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.508438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 
17:51:55.508471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.508487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 17:51:55.508515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.508530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 17:51:55.508557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.508572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.890 #18 NEW cov: 11724 ft: 13238 corp: 4/73b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:02.890 [2024-11-19 17:51:55.558506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0ae1 cdw11:ff210003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.558537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 17:51:55.558583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.558604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 17:51:55.558633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.558648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 17:51:55.558676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.558695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.890 #19 NEW cov: 11809 ft: 13509 corp: 5/104b lim: 35 exec/s: 0 rss: 67Mb L: 31/34 MS: 1 ChangeByte- 00:08:02.890 [2024-11-19 17:51:55.628586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.628623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.890 [2024-11-19 17:51:55.628669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.628685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.890 #20 NEW 
cov: 11809 ft: 13879 corp: 6/122b lim: 35 exec/s: 0 rss: 67Mb L: 18/34 MS: 1 InsertRepeatedBytes- 00:08:02.890 [2024-11-19 17:51:55.678647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.678678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.890 #21 NEW cov: 11809 ft: 13967 corp: 7/130b lim: 35 exec/s: 0 rss: 67Mb L: 8/34 MS: 1 InsertByte- 00:08:02.890 [2024-11-19 17:51:55.748870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.890 [2024-11-19 17:51:55.748903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.150 #22 NEW cov: 11809 ft: 14065 corp: 8/143b lim: 35 exec/s: 0 rss: 67Mb L: 13/34 MS: 1 EraseBytes- 00:08:03.150 [2024-11-19 17:51:55.819099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.150 [2024-11-19 17:51:55.819134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.150 #23 NEW cov: 11809 ft: 14096 corp: 9/156b lim: 35 exec/s: 0 rss: 67Mb L: 13/34 MS: 1 ChangeByte- 00:08:03.150 [2024-11-19 17:51:55.889203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.150 [2024-11-19 17:51:55.889235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.150 #24 NEW cov: 11809 ft: 14106 corp: 10/164b lim: 35 exec/s: 0 rss: 67Mb L: 8/34 MS: 1 CopyPart- 00:08:03.150 [2024-11-19 17:51:55.959678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.150 [2024-11-19 17:51:55.959710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.150 [2024-11-19 17:51:55.959743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.150 [2024-11-19 17:51:55.959759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.150 [2024-11-19 17:51:55.959788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.150 [2024-11-19 17:51:55.959804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.150 [2024-11-19 17:51:55.959832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.150 [2024-11-19 17:51:55.959848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.150 #25 NEW cov: 11809 ft: 14126 corp: 11/192b 
lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:08:03.409 [2024-11-19 17:51:56.029817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.409 [2024-11-19 17:51:56.029851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.409 [2024-11-19 17:51:56.029899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.409 [2024-11-19 17:51:56.029915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.409 [2024-11-19 17:51:56.029944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.409 [2024-11-19 17:51:56.029959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.409 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.409 #26 NEW cov: 11832 ft: 14375 corp: 12/214b lim: 35 exec/s: 0 rss: 68Mb L: 22/34 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:03.409 [2024-11-19 17:51:56.089963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.409 [2024-11-19 17:51:56.089995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.409 [2024-11-19 17:51:56.090028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.409 [2024-11-19 17:51:56.090044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.409 [2024-11-19 17:51:56.090072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.090087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.090115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.090130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.410 #27 NEW cov: 11832 ft: 14420 corp: 13/242b lim: 35 exec/s: 27 rss: 68Mb L: 28/34 MS: 1 ShuffleBytes- 00:08:03.410 [2024-11-19 17:51:56.150155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0ae1 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.150187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.150234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 
cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.150249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.150277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.150292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.150319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.150338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.410 #28 NEW cov: 11832 ft: 14443 corp: 14/273b lim: 35 exec/s: 28 rss: 68Mb L: 31/34 MS: 1 ChangeBinInt- 00:08:03.410 [2024-11-19 17:51:56.200251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.200280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.200327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.200342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.200370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.200385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.200412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.200427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.410 #29 NEW cov: 11832 ft: 14574 corp: 15/307b lim: 35 exec/s: 29 rss: 68Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:03.410 [2024-11-19 17:51:56.260446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.260477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.260509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.260524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.260552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 
nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.260567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.410 [2024-11-19 17:51:56.260594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.410 [2024-11-19 17:51:56.260621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.670 #30 NEW cov: 11832 ft: 14657 corp: 16/341b lim: 35 exec/s: 30 rss: 68Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:03.670 [2024-11-19 17:51:56.320399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a123bc5 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.320429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.670 #31 NEW cov: 11832 ft: 14681 corp: 17/349b lim: 35 exec/s: 31 rss: 68Mb L: 8/34 MS: 1 ShuffleBytes- 00:08:03.670 [2024-11-19 17:51:56.370684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.370715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.670 [2024-11-19 17:51:56.370765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:f3f3f3f3 cdw11:f31b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.370781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.670 [2024-11-19 17:51:56.370809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.370824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.670 [2024-11-19 17:51:56.370851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.370866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.670 #32 NEW cov: 11832 ft: 14747 corp: 18/377b lim: 35 exec/s: 32 rss: 68Mb L: 28/34 MS: 1 ChangeByte- 00:08:03.670 [2024-11-19 17:51:56.430858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.430889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.670 [2024-11-19 17:51:56.430935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.430950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.670 
[2024-11-19 17:51:56.430977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.430993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.670 #33 NEW cov: 11832 ft: 14766 corp: 19/399b lim: 35 exec/s: 33 rss: 68Mb L: 22/34 MS: 1 ChangeBit- 00:08:03.670 [2024-11-19 17:51:56.490935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.490966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.670 [2024-11-19 17:51:56.490997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.670 [2024-11-19 17:51:56.491012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.670 #34 NEW cov: 11832 ft: 14790 corp: 20/417b lim: 35 exec/s: 34 rss: 68Mb L: 18/34 MS: 1 ChangeByte- 00:08:03.930 [2024-11-19 17:51:56.541013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.930 [2024-11-19 17:51:56.541043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.541089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.541104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.931 #35 NEW cov: 11832 ft: 14846 corp: 21/435b lim: 35 exec/s: 35 rss: 68Mb L: 18/34 MS: 1 ChangeBit- 00:08:03.931 [2024-11-19 17:51:56.601349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.601379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.601415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.601431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.601458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.601474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.601501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.601516] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.931 #36 NEW cov: 11832 ft: 14865 corp: 22/469b lim: 35 exec/s: 36 rss: 68Mb L: 34/34 MS: 1 ChangeBit- 00:08:03.931 [2024-11-19 17:51:56.651342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.651373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.651420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00120001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.651436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.931 #42 NEW cov: 11832 ft: 14881 corp: 23/487b lim: 35 exec/s: 42 rss: 68Mb L: 18/34 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:03.931 [2024-11-19 17:51:56.721453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.721482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.931 #43 NEW cov: 11832 ft: 14951 corp: 24/500b lim: 35 exec/s: 43 rss: 68Mb L: 13/34 MS: 1 CrossOver- 00:08:03.931 [2024-11-19 17:51:56.792768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.792794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.792865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3bc50000 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.792880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.792931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.792944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.931 [2024-11-19 17:51:56.792995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.931 [2024-11-19 17:51:56.793009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.192 #44 NEW cov: 11832 ft: 15058 corp: 25/532b lim: 35 exec/s: 44 rss: 68Mb L: 32/34 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:04.192 [2024-11-19 17:51:56.832510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.832538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:56.832592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.832611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.192 #49 NEW cov: 11832 ft: 15174 corp: 26/546b lim: 35 exec/s: 49 rss: 68Mb L: 14/34 MS: 5 CrossOver-ShuffleBytes-CrossOver-ChangeByte-CrossOver- 00:08:04.192 [2024-11-19 17:51:56.872642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff40 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.872668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:56.872723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.872736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.192 #50 NEW cov: 11832 ft: 15216 corp: 27/560b lim: 35 exec/s: 50 rss: 68Mb L: 14/34 MS: 1 ChangeBit- 00:08:04.192 [2024-11-19 17:51:56.912630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:f5edc53a cdw11:fff60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.912655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.192 #51 NEW cov: 11832 ft: 15237 corp: 28/568b lim: 35 exec/s: 51 rss: 68Mb L: 8/34 MS: 1 ChangeBinInt- 00:08:04.192 [2024-11-19 17:51:56.952734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:23000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.952759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.192 #52 NEW cov: 11832 ft: 15250 corp: 29/581b lim: 35 exec/s: 52 rss: 68Mb L: 13/34 MS: 1 ChangeByte- 00:08:04.192 [2024-11-19 17:51:56.993262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:12003bc5 cdw11:0a3c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.993288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:56.993341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:3bc50000 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.993355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:56.993406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.993419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:56.993471] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f3f3f3f3 cdw11:f3f30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:56.993483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.192 #53 NEW cov: 11832 ft: 15262 corp: 30/613b lim: 35 exec/s: 53 rss: 68Mb L: 32/34 MS: 1 ChangeByte- 00:08:04.192 [2024-11-19 17:51:57.033408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:57.033436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:57.033503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:57.033517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:57.033569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000012 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:57.033582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.192 [2024-11-19 17:51:57.033638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000020 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.192 [2024-11-19 17:51:57.033651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.552 #54 NEW cov: 11832 ft: 15291 corp: 31/645b lim: 35 exec/s: 54 rss: 68Mb L: 32/34 MS: 1 InsertRepeatedBytes- 00:08:04.552 [2024-11-19 17:51:57.073077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000a3b12 cdw11:c0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.552 [2024-11-19 17:51:57.073102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.552 #55 NEW cov: 11832 ft: 15310 corp: 32/652b lim: 35 exec/s: 55 rss: 68Mb L: 7/34 MS: 1 ChangeByte- 00:08:04.552 [2024-11-19 17:51:57.113646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7e7e3b7e cdw11:7e7e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.552 [2024-11-19 17:51:57.113671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.552 [2024-11-19 17:51:57.113739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.552 [2024-11-19 17:51:57.113753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.552 [2024-11-19 17:51:57.113806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.552 [2024-11-19 17:51:57.113819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.552 [2024-11-19 17:51:57.113869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.552 [2024-11-19 17:51:57.113883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.552 #56 NEW cov: 11832 ft: 15329 corp: 33/686b lim: 35 exec/s: 28 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:04.552 #56 DONE cov: 11832 ft: 15329 corp: 33/686b lim: 35 exec/s: 28 rss: 68Mb 00:08:04.552 ###### Recommended dictionary. ###### 00:08:04.552 "\022\000\000\000" # Uses: 3 00:08:04.552 ###### End of recommended dictionary. ###### 00:08:04.552 Done 56 runs in 2 second(s) 00:08:04.552 17:51:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:08:04.552 17:51:57 -- ../common.sh@72 -- # (( i++ )) 00:08:04.552 17:51:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.552 17:51:57 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:04.552 17:51:57 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:04.552 17:51:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.552 17:51:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.552 17:51:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:04.552 17:51:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:04.552 17:51:57 -- nvmf/run.sh@29 -- # printf %02d 5 00:08:04.552 17:51:57 -- nvmf/run.sh@29 -- # port=4405 00:08:04.552 17:51:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:04.552 17:51:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:04.552 17:51:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.552 17:51:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:08:04.552 [2024-11-19 17:51:57.293142] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
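The trace above shows how each fuzzer instance derives its own TCP listener port and per-instance JSON config before llvm_nvme_fuzz is launched. A minimal bash sketch of that derivation for instance 5, reconstructed from the trace (the redirection of sed's output into the per-instance config is implied rather than shown in the trace, and the variable names are assumptions):

#!/bin/bash
i=5                                      # fuzzer index passed to start_llvm_fuzz
port="44$(printf %02d "$i")"             # printf %02d 5 -> "05", so port=4405
nvmf_cfg="/tmp/fuzz_json_${i}.conf"
# Patch the template config's default listener (trsvcid 4420) to this
# instance's port, writing a private copy for this fuzzer:
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"
# Transport ID string matching the trid seen in the trace above:
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
echo "$trid"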
00:08:04.552 [2024-11-19 17:51:57.293214] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid636070 ] 00:08:04.552 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.829 [2024-11-19 17:51:57.553030] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.829 [2024-11-19 17:51:57.580425] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:04.829 [2024-11-19 17:51:57.580550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.829 [2024-11-19 17:51:57.632196] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.829 [2024-11-19 17:51:57.648522] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:04.829 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.829 INFO: Seed: 1187379108 00:08:04.829 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:04.829 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:04.829 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:04.829 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.829 #2 INITED exec/s: 0 rss: 59Mb 00:08:04.829 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:04.829 This may also happen if the target rejected all inputs we tried so far 00:08:05.098 [2024-11-19 17:51:57.719490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.098 [2024-11-19 17:51:57.719525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.098 [2024-11-19 17:51:57.719593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.098 [2024-11-19 17:51:57.719611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.098 [2024-11-19 17:51:57.719671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.098 [2024-11-19 17:51:57.719686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.098 [2024-11-19 17:51:57.719748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.098 [2024-11-19 17:51:57.719761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.390 NEW_FUNC[1/671]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:05.390 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.390 #4 NEW cov: 11616 ft: 11617 corp: 2/41b lim: 45 exec/s: 0 rss: 66Mb L: 
40/40 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:05.390 [2024-11-19 17:51:58.039575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.039625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.039764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.039786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.039910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.039930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.040055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.040078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.390 #5 NEW cov: 11729 ft: 12168 corp: 3/81b lim: 45 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\200"- 00:08:05.390 [2024-11-19 17:51:58.089536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.089563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.089682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.089699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.089810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.089826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.089938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.089955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.390 #6 NEW cov: 11735 ft: 12451 corp: 4/121b lim: 45 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CMP- DE: "\001\034"- 00:08:05.390 [2024-11-19 17:51:58.129679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.129711] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.129833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.129850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.129967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e600e6e6 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.129984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.130106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.130124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.390 #7 NEW cov: 11820 ft: 12824 corp: 5/161b lim: 45 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CopyPart- 00:08:05.390 [2024-11-19 17:51:58.169762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.169789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.169902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.169919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.170035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.170051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.170173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.170190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.390 #8 NEW cov: 11820 ft: 12909 corp: 6/201b lim: 45 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBit- 00:08:05.390 [2024-11-19 17:51:58.209867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.209893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.210011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.210028] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.210141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.210160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.390 [2024-11-19 17:51:58.210269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.390 [2024-11-19 17:51:58.210286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.390 #9 NEW cov: 11820 ft: 13073 corp: 7/242b lim: 45 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 InsertByte- 00:08:05.659 [2024-11-19 17:51:58.259853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.259879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.260011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.260027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.260151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.260170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.659 #10 NEW cov: 11820 ft: 13509 corp: 8/272b lim: 45 exec/s: 0 rss: 67Mb L: 30/41 MS: 1 EraseBytes- 00:08:05.659 [2024-11-19 17:51:58.300302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.300330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.300450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.300467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.300571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.300588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.300713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e61d cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.300731] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.659 #11 NEW cov: 11820 ft: 13531 corp: 9/313b lim: 45 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 ChangeBinInt- 00:08:05.659 [2024-11-19 17:51:58.350409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.350436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.350560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.350577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.350694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:c6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.350711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.350824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.350841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.659 #12 NEW cov: 11820 ft: 13626 corp: 10/354b lim: 45 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 ChangeBit- 00:08:05.659 [2024-11-19 17:51:58.390339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.390367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.390483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.390501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.390626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.390642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.659 #13 NEW cov: 11820 ft: 13664 corp: 11/386b lim: 45 exec/s: 0 rss: 67Mb L: 32/41 MS: 1 PersAutoDict- DE: "\001\034"- 00:08:05.659 [2024-11-19 17:51:58.441067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.441094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.441243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 
nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.441259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.441371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.441388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.441510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.441526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.441651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:0000e600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.441668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.659 #19 NEW cov: 11820 ft: 13791 corp: 12/431b lim: 45 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 CopyPart- 00:08:05.659 [2024-11-19 17:51:58.480741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.480773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.480915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.480933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.481050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.481066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.659 [2024-11-19 17:51:58.481187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1ce6e601 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.659 [2024-11-19 17:51:58.481203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.659 #20 NEW cov: 11820 ft: 13801 corp: 13/472b lim: 45 exec/s: 0 rss: 68Mb L: 41/45 MS: 1 PersAutoDict- DE: "\001\034"- 00:08:05.941 [2024-11-19 17:51:58.520993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.521021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.521144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.521161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.521280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1ce6e601 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.521297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.521417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1ce6e601 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.521434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.941 #21 NEW cov: 11820 ft: 13889 corp: 14/513b lim: 45 exec/s: 0 rss: 68Mb L: 41/45 MS: 1 PersAutoDict- DE: "\001\034"- 00:08:05.941 [2024-11-19 17:51:58.571126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.571152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.571272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.571289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.571416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.571433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.571562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.571578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.941 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.941 #22 NEW cov: 11843 ft: 13930 corp: 15/553b lim: 45 exec/s: 0 rss: 68Mb L: 40/45 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\200"- 00:08:05.941 [2024-11-19 17:51:58.611144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.611172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.611295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.611312] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.611430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e600e6e6 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.611446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.611565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6b4e6e6 cdw11:e6e60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.611583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.941 #23 NEW cov: 11843 ft: 13970 corp: 16/594b lim: 45 exec/s: 0 rss: 69Mb L: 41/45 MS: 1 InsertByte- 00:08:05.941 [2024-11-19 17:51:58.651303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.651331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.651455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.651471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.651584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1ce6e601 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.651603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.651714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1ce6e601 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.651731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.941 #24 NEW cov: 11843 ft: 14057 corp: 17/635b lim: 45 exec/s: 0 rss: 69Mb L: 41/45 MS: 1 ChangeBit- 00:08:05.941 [2024-11-19 17:51:58.691183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.691212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.691328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6a60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.691346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.691462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1ce6e601 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.691480] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 #30 NEW cov: 11843 ft: 14081 corp: 18/664b lim: 45 exec/s: 30 rss: 69Mb L: 29/45 MS: 1 EraseBytes- 00:08:05.941 [2024-11-19 17:51:58.741489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.741515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.741648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e60000 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.741668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.741781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.741798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.741914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.741932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.941 #31 NEW cov: 11843 ft: 14095 corp: 19/708b lim: 45 exec/s: 31 rss: 69Mb L: 44/45 MS: 1 InsertRepeatedBytes- 00:08:05.941 [2024-11-19 17:51:58.781661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.781688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.781803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.781822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.781930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.781947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.941 [2024-11-19 17:51:58.782061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:a6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.941 [2024-11-19 17:51:58.782077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.253 #32 NEW cov: 11843 ft: 14115 corp: 20/748b lim: 45 exec/s: 32 rss: 69Mb L: 40/45 MS: 1 CrossOver- 00:08:06.253 [2024-11-19 17:51:58.821535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.821561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.821703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e613e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.821721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.821843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.821860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.253 #38 NEW cov: 11843 ft: 14141 corp: 21/778b lim: 45 exec/s: 38 rss: 69Mb L: 30/45 MS: 1 ChangeBinInt- 00:08:06.253 [2024-11-19 17:51:58.861342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.861368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.861488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e613e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.861507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.253 #39 NEW cov: 11843 ft: 14407 corp: 22/796b lim: 45 exec/s: 39 rss: 69Mb L: 18/45 MS: 1 EraseBytes- 00:08:06.253 [2024-11-19 17:51:58.902056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.902082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.902194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e2 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.902212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.902329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e600e6e6 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.902344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.902461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.902479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.253 #40 NEW cov: 11843 ft: 14425 corp: 23/836b lim: 45 exec/s: 40 rss: 69Mb L: 40/45 MS: 1 ChangeBit- 00:08:06.253 [2024-11-19 17:51:58.941574] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.941605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.941728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.941744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.253 #41 NEW cov: 11843 ft: 14438 corp: 24/861b lim: 45 exec/s: 41 rss: 69Mb L: 25/45 MS: 1 EraseBytes- 00:08:06.253 [2024-11-19 17:51:58.982048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.982074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.982202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.982220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:58.982338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:58.982356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.253 #42 NEW cov: 11843 ft: 14455 corp: 25/892b lim: 45 exec/s: 42 rss: 69Mb L: 31/45 MS: 1 EraseBytes- 00:08:06.253 [2024-11-19 17:51:59.022356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:59.022383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:59.022507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e2 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:59.022523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:59.022640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e600e6e6 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:59.022657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.253 [2024-11-19 17:51:59.022776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.253 [2024-11-19 17:51:59.022797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.254 #43 NEW cov: 11843 ft: 14467 corp: 26/932b 
lim: 45 exec/s: 43 rss: 69Mb L: 40/45 MS: 1 ShuffleBytes- 00:08:06.254 [2024-11-19 17:51:59.062701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.062728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.062850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.062865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.062978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.062994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.063108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e61d cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.063127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.254 #44 NEW cov: 11843 ft: 14493 corp: 27/973b lim: 45 exec/s: 44 rss: 69Mb L: 41/45 MS: 1 ShuffleBytes- 00:08:06.254 [2024-11-19 17:51:59.113023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.113053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.113161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.113178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.113298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.113316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.113437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0000e600 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.113454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.254 [2024-11-19 17:51:59.113576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.254 [2024-11-19 17:51:59.113592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.567 #45 NEW cov: 11843 ft: 14510 corp: 
28/1018b lim: 45 exec/s: 45 rss: 69Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:06.567 [2024-11-19 17:51:59.152780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.567 [2024-11-19 17:51:59.152808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.567 [2024-11-19 17:51:59.152931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.567 [2024-11-19 17:51:59.152953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.567 [2024-11-19 17:51:59.153075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.567 [2024-11-19 17:51:59.153091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.153210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.153228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.568 #46 NEW cov: 11843 ft: 14525 corp: 29/1058b lim: 45 exec/s: 46 rss: 69Mb L: 40/45 MS: 1 ShuffleBytes- 00:08:06.568 [2024-11-19 17:51:59.192987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.193015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.193137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.193154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.193268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e600e6e6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.193285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.193407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e600e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.193424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.568 #47 NEW cov: 11843 ft: 14533 corp: 30/1102b lim: 45 exec/s: 47 rss: 69Mb L: 44/45 MS: 1 InsertRepeatedBytes- 00:08:06.568 [2024-11-19 17:51:59.233125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.233153] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.233269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.233285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.233398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.233414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.233520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.233538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.568 #48 NEW cov: 11843 ft: 14552 corp: 31/1145b lim: 45 exec/s: 48 rss: 69Mb L: 43/45 MS: 1 CopyPart- 00:08:06.568 [2024-11-19 17:51:59.273551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.273582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.273695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.273713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.273825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.273842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.273952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0000e600 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.273969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.274082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.274100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.568 #49 NEW cov: 11843 ft: 14562 corp: 32/1190b lim: 45 exec/s: 49 rss: 69Mb L: 45/45 MS: 1 ShuffleBytes- 00:08:06.568 [2024-11-19 17:51:59.323434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:48480a48 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.323460] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.323573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.323592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.323712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.323727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.323853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.323870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.568 #50 NEW cov: 11843 ft: 14584 corp: 33/1230b lim: 45 exec/s: 50 rss: 69Mb L: 40/45 MS: 1 InsertRepeatedBytes- 00:08:06.568 [2024-11-19 17:51:59.363284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.363313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.363429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.363456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.568 [2024-11-19 17:51:59.363569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0000e600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.363586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.568 #51 NEW cov: 11843 ft: 14613 corp: 34/1257b lim: 45 exec/s: 51 rss: 69Mb L: 27/45 MS: 1 CrossOver- 00:08:06.568 [2024-11-19 17:51:59.412868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.568 [2024-11-19 17:51:59.412895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.827 #52 NEW cov: 11843 ft: 15393 corp: 35/1269b lim: 45 exec/s: 52 rss: 69Mb L: 12/45 MS: 1 CrossOver- 00:08:06.827 [2024-11-19 17:51:59.463384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.463413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.463526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.463542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.827 #53 NEW cov: 11843 ft: 15418 corp: 36/1289b lim: 45 exec/s: 53 rss: 70Mb L: 20/45 MS: 1 CrossOver- 00:08:06.827 [2024-11-19 17:51:59.504198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.504225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.504334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.504352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.504465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.504480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.504594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0000e600 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.504617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.504732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.504750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.827 #54 NEW cov: 11843 ft: 15430 corp: 37/1334b lim: 45 exec/s: 54 rss: 70Mb L: 45/45 MS: 1 ShuffleBytes- 00:08:06.827 [2024-11-19 17:51:59.543852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.543880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.543999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:1fa60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.544015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.544129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1ce6e601 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.544149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.827 #55 NEW cov: 11843 ft: 15441 corp: 38/1363b lim: 45 exec/s: 55 rss: 70Mb L: 29/45 MS: 1 ChangeByte- 
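Each "#N NEW" line above is libFuzzer recording a corpus addition: cov: is the edge-coverage count, ft: the feature count, corp: X/Yb the corpus size in inputs and bytes, L: a/b the new input's size against the largest seen so far (under the run's 45-byte lim), and MS: the mutation sequence that produced it (ShuffleBytes, CrossOver, ChangeBit, plus PersAutoDict, which replays persistent auto-dictionary entries such as "\001\034"). A minimal sketch for pulling the coverage curve out of a saved copy of this console output; the build.log file name is an assumption:

awk '/#[0-9]+ NEW / {
       for (i = 1; i <= NF; i++) {
         if ($i ~ /^#[0-9]+$/) n = $i      # event number, e.g. #55
         if ($i == "cov:")  cov  = $(i + 1)
         if ($i == "corp:") corp = $(i + 1)
       }
       print n, cov, corp                  # one row per corpus addition
     }' build.log

Plotted against the event number, cov shows whether the run is still finding new edges or has plateaued by the time the "#58 DONE" summary below is printed.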
00:08:06.827 [2024-11-19 17:51:59.584212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.827 [2024-11-19 17:51:59.584239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.827 [2024-11-19 17:51:59.584351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e6e6e6e2 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.584367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.584484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e600e6e6 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.584500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.584623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e621e6 cdw11:e6e60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.584653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.828 #56 NEW cov: 11843 ft: 15450 corp: 39/1404b lim: 45 exec/s: 56 rss: 70Mb L: 41/45 MS: 1 InsertByte- 00:08:06.828 [2024-11-19 17:51:59.634664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.634691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.634810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:2d00e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.634828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.634947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.634965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.635082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0000e600 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.635102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.635223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.635241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.828 #57 NEW cov: 11843 ft: 15460 corp: 40/1449b lim: 45 exec/s: 57 rss: 70Mb L: 45/45 MS: 1 
ChangeBinInt- 00:08:06.828 [2024-11-19 17:51:59.684539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.684566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.684693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:19191919 cdw11:19130007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.684717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.684834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.684851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.828 [2024-11-19 17:51:59.684967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.828 [2024-11-19 17:51:59.684984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.123 #58 NEW cov: 11843 ft: 15466 corp: 41/1489b lim: 45 exec/s: 29 rss: 70Mb L: 40/45 MS: 1 ChangeBinInt- 00:08:07.123 #58 DONE cov: 11843 ft: 15466 corp: 41/1489b lim: 45 exec/s: 29 rss: 70Mb 00:08:07.123 ###### Recommended dictionary. ###### 00:08:07.123 "\000\000\000\000\000\000\000\200" # Uses: 1 00:08:07.123 "\001\034" # Uses: 4 00:08:07.123 ###### End of recommended dictionary. 
###### 00:08:07.123 Done 58 runs in 2 second(s) 00:08:07.123 17:51:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:08:07.123 17:51:59 -- ../common.sh@72 -- # (( i++ )) 00:08:07.123 17:51:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.123 17:51:59 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:07.123 17:51:59 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:07.123 17:51:59 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.123 17:51:59 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.123 17:51:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:07.123 17:51:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:07.123 17:51:59 -- nvmf/run.sh@29 -- # printf %02d 6 00:08:07.123 17:51:59 -- nvmf/run.sh@29 -- # port=4406 00:08:07.123 17:51:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:07.123 17:51:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:07.123 17:51:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.123 17:51:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:08:07.123 [2024-11-19 17:51:59.869373] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:07.123 [2024-11-19 17:51:59.869462] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid636626 ] 00:08:07.123 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.383 [2024-11-19 17:52:00.120844] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.383 [2024-11-19 17:52:00.147311] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.383 [2024-11-19 17:52:00.147451] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.383 [2024-11-19 17:52:00.198802] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.383 [2024-11-19 17:52:00.215115] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:07.383 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.383 INFO: Seed: 3756399162 00:08:07.643 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:07.643 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:07.643 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:07.643 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.643 #2 INITED exec/s: 0 rss: 59Mb 00:08:07.643 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
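The nvmf/run.sh trace above shows how each fuzzer in the sequence gets its own target: fuzzer 6's TCP port is derived by appending the zero-padded fuzzer number to 44 (printf %02d 6 -> 4406), the stock fuzz_json.conf has its trsvcid rewritten from the default 4420 to that port, and llvm_nvme_fuzz is then launched with the matching trid and a per-fuzzer corpus directory. A condensed sketch of that setup; SPDK_DIR stands in for the long workspace path in the log, and the redirect into the temp config is assumed from the -c /tmp/fuzz_json_6.conf handed to the fuzzer:

SPDK_DIR=/path/to/spdk                    # placeholder for the workspace path
fuzzer_type=6
port="44$(printf %02d "$fuzzer_type")"    # -> 4406, matching the trace
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type"
mkdir -p "$corpus_dir"
# Point the JSON config at the per-fuzzer listener (the stock file uses 4420).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

This is why the TCP target above reports listening on 127.0.0.1 port 4406: one port per fuzzer_type keeps successive runs from reusing the default 4420 listener.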
00:08:07.643 This may also happen if the target rejected all inputs we tried so far 00:08:07.643 [2024-11-19 17:52:00.292182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:07.643 [2024-11-19 17:52:00.292222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.904 NEW_FUNC[1/669]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:07.904 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.904 #4 NEW cov: 11533 ft: 11519 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 ShuffleBytes-CrossOver- 00:08:07.904 [2024-11-19 17:52:00.612212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:07.904 [2024-11-19 17:52:00.612250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.904 [2024-11-19 17:52:00.612369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.904 [2024-11-19 17:52:00.612385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.904 #5 NEW cov: 11646 ft: 12423 corp: 3/8b lim: 10 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:07.904 [2024-11-19 17:52:00.662046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:07.904 [2024-11-19 17:52:00.662074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.904 [2024-11-19 17:52:00.662197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.904 [2024-11-19 17:52:00.662214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.904 [2024-11-19 17:52:00.662340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.905 [2024-11-19 17:52:00.662356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.905 #6 NEW cov: 11652 ft: 12871 corp: 4/14b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:08:07.905 [2024-11-19 17:52:00.712925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:07.905 [2024-11-19 17:52:00.712952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.905 [2024-11-19 17:52:00.713073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.905 [2024-11-19 17:52:00.713093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.905 [2024-11-19 17:52:00.713210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.905 [2024-11-19 17:52:00.713230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.905 [2024-11-19 17:52:00.713347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.905 [2024-11-19 17:52:00.713363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.905 #7 NEW cov: 11737 ft: 13356 corp: 5/23b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CopyPart- 00:08:08.165 [2024-11-19 17:52:00.772641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.772676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.772798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.772814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.165 #8 NEW cov: 11737 ft: 13435 corp: 6/28b lim: 10 exec/s: 0 rss: 67Mb L: 5/9 MS: 1 ChangeBit- 00:08:08.165 [2024-11-19 17:52:00.812616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.812643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.812752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.812768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.812883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e00 cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.812898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.165 #9 NEW cov: 11737 ft: 13519 corp: 7/34b lim: 10 exec/s: 0 rss: 67Mb L: 6/9 MS: 1 ChangeByte- 00:08:08.165 [2024-11-19 17:52:00.852651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.852689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.852809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.852825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.852935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.852953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.165 #10 NEW 
cov: 11737 ft: 13570 corp: 8/41b lim: 10 exec/s: 0 rss: 67Mb L: 7/9 MS: 1 InsertRepeatedBytes- 00:08:08.165 [2024-11-19 17:52:00.903452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.903478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.903604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.903620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.903749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.903765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.903887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.903903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.165 #12 NEW cov: 11737 ft: 13620 corp: 9/49b lim: 10 exec/s: 0 rss: 67Mb L: 8/9 MS: 2 ChangeByte-CrossOver- 00:08:08.165 [2024-11-19 17:52:00.953635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.953667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.953788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.165 [2024-11-19 17:52:00.953806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.165 [2024-11-19 17:52:00.953933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.166 [2024-11-19 17:52:00.953952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.166 [2024-11-19 17:52:00.954077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.166 [2024-11-19 17:52:00.954096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.166 #13 NEW cov: 11737 ft: 13697 corp: 10/57b lim: 10 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 ShuffleBytes- 00:08:08.166 [2024-11-19 17:52:01.013611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.166 [2024-11-19 17:52:01.013639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.166 [2024-11-19 17:52:01.013759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.166 [2024-11-19 17:52:01.013777] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.166 [2024-11-19 17:52:01.013895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0b cdw11:00000000 00:08:08.166 [2024-11-19 17:52:01.013911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.425 #14 NEW cov: 11737 ft: 13784 corp: 11/64b lim: 10 exec/s: 0 rss: 68Mb L: 7/9 MS: 1 ChangeBit- 00:08:08.425 [2024-11-19 17:52:01.073592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.425 [2024-11-19 17:52:01.073626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.425 [2024-11-19 17:52:01.073746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.425 [2024-11-19 17:52:01.073762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.425 #15 NEW cov: 11737 ft: 13812 corp: 12/69b lim: 10 exec/s: 0 rss: 68Mb L: 5/9 MS: 1 CrossOver- 00:08:08.425 [2024-11-19 17:52:01.123889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.425 [2024-11-19 17:52:01.123915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.425 [2024-11-19 17:52:01.124039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.425 [2024-11-19 17:52:01.124056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.425 [2024-11-19 17:52:01.124183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e08 cdw11:00000000 00:08:08.425 [2024-11-19 17:52:01.124199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.425 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.425 #16 NEW cov: 11760 ft: 13879 corp: 13/75b lim: 10 exec/s: 0 rss: 68Mb L: 6/9 MS: 1 ChangeBit- 00:08:08.426 [2024-11-19 17:52:01.184537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.184564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.184678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.184702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.184813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.184834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
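Each #N NEW event in this stream is a standard libFuzzer status line: cov: is the cumulative number of instrumented edges hit, ft: the number of distinct features, corp: the corpus size in entries and total bytes, lim: the current input-length cap, exec/s and rss the throughput and memory footprint, L: the new input's length and the largest in the corpus, and MS: the mutation sequence that produced it. A NEW_FUNC line (here get_rusage just before event #16) marks a function executed for the first time, which is why cov: jumps from 11737 to 11760 at that event. As a minimal sketch, coverage growth can be charted from a saved copy of this console output; build.log below is a hypothetical capture, not a file produced by this job:

# Pull the event id and cumulative edge count from every NEW/DONE line.
grep -oE '#[0-9]+ (NEW|DONE) cov: [0-9]+' build.log | awk '{print $1, $4}'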
00:08:08.426 [2024-11-19 17:52:01.184947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.184964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.185076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.185094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.426 #17 NEW cov: 11760 ft: 14025 corp: 14/85b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:08:08.426 [2024-11-19 17:52:01.234495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.234522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.234656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.234674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.234802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.234819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.234931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.234949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.426 #18 NEW cov: 11760 ft: 14048 corp: 15/93b lim: 10 exec/s: 18 rss: 68Mb L: 8/10 MS: 1 EraseBytes- 00:08:08.426 [2024-11-19 17:52:01.284254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.284281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.426 [2024-11-19 17:52:01.284384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002000 cdw11:00000000 00:08:08.426 [2024-11-19 17:52:01.284402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.686 #19 NEW cov: 11760 ft: 14159 corp: 16/98b lim: 10 exec/s: 19 rss: 68Mb L: 5/10 MS: 1 ShuffleBytes- 00:08:08.686 [2024-11-19 17:52:01.334736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005aff cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.334765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.334889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.334909] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.335029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.335047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.335168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.335184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.686 #20 NEW cov: 11760 ft: 14188 corp: 17/106b lim: 10 exec/s: 20 rss: 68Mb L: 8/10 MS: 1 ChangeBit- 00:08:08.686 [2024-11-19 17:52:01.384668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.384694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.384818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.384834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.384955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.384972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.686 #21 NEW cov: 11760 ft: 14215 corp: 18/112b lim: 10 exec/s: 21 rss: 68Mb L: 6/10 MS: 1 CopyPart- 00:08:08.686 [2024-11-19 17:52:01.434677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.434704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.434849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.434866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.686 #22 NEW cov: 11760 ft: 14243 corp: 19/117b lim: 10 exec/s: 22 rss: 68Mb L: 5/10 MS: 1 CrossOver- 00:08:08.686 [2024-11-19 17:52:01.484806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.484832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.484950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.484966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.686 #23 NEW cov: 11760 ft: 14264 corp: 20/122b lim: 10 exec/s: 23 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 
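The paired *NOTICE* lines are SPDK printing each fuzzed command and its completion: admin opcode 04h is DELETE IO CQ (Delete I/O Completion Queue), whose cdw10 carries the queue identifier to delete in its low 16 bits, and the target answers every attempt with INVALID OPCODE (00/01), i.e. status code type 0 with status code 01h, since an NVMe-oF target manages queues through the Connect fabrics command rather than the PCIe-style queue-management opcodes. A minimal shell sketch of the cdw10 decode, using a value from the entries above:

# Per the NVMe spec, the low 16 bits of cdw10 in DELETE IO CQ (04h)
# select the completion queue to delete.
cdw10=0x00000a0a   # taken from one of the traced commands above
printf 'QID to delete: %u\n' $(( cdw10 & 0xFFFF ))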
00:08:08.686 [2024-11-19 17:52:01.535323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.535350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.535475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.535490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.535602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.535622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.686 [2024-11-19 17:52:01.535747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.686 [2024-11-19 17:52:01.535767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.946 #24 NEW cov: 11760 ft: 14288 corp: 21/131b lim: 10 exec/s: 24 rss: 69Mb L: 9/10 MS: 1 ChangeBinInt- 00:08:08.946 [2024-11-19 17:52:01.584853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a28 cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.584880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.946 [2024-11-19 17:52:01.585005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.585021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.946 [2024-11-19 17:52:01.585130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e00 cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.585146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.946 #25 NEW cov: 11760 ft: 14321 corp: 22/137b lim: 10 exec/s: 25 rss: 69Mb L: 6/10 MS: 1 ChangeByte- 00:08:08.946 [2024-11-19 17:52:01.635798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.635824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.946 [2024-11-19 17:52:01.635944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.635961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.946 [2024-11-19 17:52:01.636072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.636089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:08.946 [2024-11-19 17:52:01.636199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000aff cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.636215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.946 [2024-11-19 17:52:01.636325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.636343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.946 #26 NEW cov: 11760 ft: 14333 corp: 23/147b lim: 10 exec/s: 26 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:08:08.946 [2024-11-19 17:52:01.685780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daff cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.685807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.946 [2024-11-19 17:52:01.685931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.946 [2024-11-19 17:52:01.685947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.947 [2024-11-19 17:52:01.686064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a1ff cdw11:00000000 00:08:08.947 [2024-11-19 17:52:01.686083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.947 [2024-11-19 17:52:01.686205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:08.947 [2024-11-19 17:52:01.686222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.947 #27 NEW cov: 11760 ft: 14344 corp: 24/156b lim: 10 exec/s: 27 rss: 69Mb L: 9/10 MS: 1 InsertByte- 00:08:08.947 [2024-11-19 17:52:01.725192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:08:08.947 [2024-11-19 17:52:01.725219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.947 #28 NEW cov: 11760 ft: 14379 corp: 25/158b lim: 10 exec/s: 28 rss: 69Mb L: 2/10 MS: 1 InsertByte- 00:08:08.947 [2024-11-19 17:52:01.765725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a28 cdw11:00000000 00:08:08.947 [2024-11-19 17:52:01.765752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.947 [2024-11-19 17:52:01.765868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.947 [2024-11-19 17:52:01.765885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.947 [2024-11-19 17:52:01.765998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e00 cdw11:00000000 00:08:08.947 [2024-11-19 17:52:01.766014] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.947 #29 NEW cov: 11760 ft: 14442 corp: 26/165b lim: 10 exec/s: 29 rss: 69Mb L: 7/10 MS: 1 InsertByte- 00:08:09.208 [2024-11-19 17:52:01.826190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.826218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:01.826346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.826364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:01.826488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a1ff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.826506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:01.826606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.826617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.208 #30 NEW cov: 11760 ft: 14469 corp: 27/174b lim: 10 exec/s: 30 rss: 69Mb L: 9/10 MS: 1 ChangeBinInt- 00:08:09.208 [2024-11-19 17:52:01.875709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.875736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.208 #31 NEW cov: 11760 ft: 14479 corp: 28/177b lim: 10 exec/s: 31 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:08:09.208 [2024-11-19 17:52:01.915603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.915630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:01.915750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b0a cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.915770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.208 #32 NEW cov: 11760 ft: 14486 corp: 29/181b lim: 10 exec/s: 32 rss: 69Mb L: 4/10 MS: 1 EraseBytes- 00:08:09.208 [2024-11-19 17:52:01.956041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.956068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:01.956183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000 00:08:09.208 [2024-11-19 17:52:01.956199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
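The trailing MS: field names the mutation sequence that derived each input: one or more libFuzzer operators such as ChangeBit, ChangeByte, CopyPart, CrossOver, EraseBytes, ShuffleBytes, or InsertRepeatedBytes. A quick tally of which operators are being credited with new coverage, again assuming a hypothetical build.log capture of this output:

# Split each credited mutation sequence into operators and count them.
grep -oE 'MS: [0-9]+ [A-Za-z-]+' build.log |
  cut -d' ' -f3 | tr '-' '\n' | grep -v '^$' | sort | uniq -c | sort -rn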
00:08:09.208 #33 NEW cov: 11760 ft: 14567 corp: 30/186b lim: 10 exec/s: 33 rss: 69Mb L: 5/10 MS: 1 ShuffleBytes- 00:08:09.208 [2024-11-19 17:52:02.006115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2f cdw11:00000000 00:08:09.208 [2024-11-19 17:52:02.006141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.208 #34 NEW cov: 11760 ft: 14575 corp: 31/188b lim: 10 exec/s: 34 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:08:09.208 [2024-11-19 17:52:02.056893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:02.056920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:02.057049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:02.057066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:02.057178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:02.057196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.208 [2024-11-19 17:52:02.057318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.208 [2024-11-19 17:52:02.057336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.468 #35 NEW cov: 11760 ft: 14601 corp: 32/197b lim: 10 exec/s: 35 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:09.468 [2024-11-19 17:52:02.107088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daff cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.107115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.107231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.107249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.107368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a1ff cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.107385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.107500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.107515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.468 #36 NEW cov: 11760 ft: 14612 corp: 33/206b lim: 10 exec/s: 36 rss: 69Mb L: 9/10 MS: 1 ShuffleBytes- 00:08:09.468 [2024-11-19 17:52:02.157560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.157586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.157701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.157716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.157828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.157845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.157962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.468 [2024-11-19 17:52:02.157978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.468 [2024-11-19 17:52:02.158102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:09.469 [2024-11-19 17:52:02.158117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.469 #37 NEW cov: 11760 ft: 14638 corp: 34/216b lim: 10 exec/s: 37 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:08:09.469 [2024-11-19 17:52:02.207502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.469 [2024-11-19 17:52:02.207529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.469 [2024-11-19 17:52:02.207648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.469 [2024-11-19 17:52:02.207664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.469 [2024-11-19 17:52:02.207782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.469 [2024-11-19 17:52:02.207798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.469 [2024-11-19 17:52:02.207906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000080 cdw11:00000000 00:08:09.469 [2024-11-19 17:52:02.207923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.469 #38 NEW cov: 11760 ft: 14646 corp: 35/224b lim: 10 exec/s: 38 rss: 69Mb L: 8/10 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\200"- 00:08:09.469 [2024-11-19 17:52:02.257827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.469 [2024-11-19 17:52:02.257855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.469 [2024-11-19 17:52:02.257972] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000
00:08:09.469 [2024-11-19 17:52:02.257990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:09.469 [2024-11-19 17:52:02.258108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000
00:08:09.469 [2024-11-19 17:52:02.258125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:09.469 [2024-11-19 17:52:02.258244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000
00:08:09.469 [2024-11-19 17:52:02.258264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:09.469 [2024-11-19 17:52:02.258378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000
00:08:09.469 [2024-11-19 17:52:02.258396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:08:09.469 #39 NEW cov: 11760 ft: 14659 corp: 36/234b lim: 10 exec/s: 19 rss: 69Mb L: 10/10 MS: 1 CopyPart-
00:08:09.469 #39 DONE cov: 11760 ft: 14659 corp: 36/234b lim: 10 exec/s: 19 rss: 69Mb
00:08:09.469 ###### Recommended dictionary. ######
00:08:09.469 "\000\000\000\000\000\000\000\200" # Uses: 0
00:08:09.469 ###### End of recommended dictionary. ######
00:08:09.469 Done 39 runs in 2 second(s)
00:08:09.729 17:52:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf
00:08:09.729 17:52:02 -- ../common.sh@72 -- # (( i++ ))
00:08:09.729 17:52:02 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:09.729 17:52:02 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1
00:08:09.729 17:52:02 -- nvmf/run.sh@23 -- # local fuzzer_type=7
00:08:09.729 17:52:02 -- nvmf/run.sh@24 -- # local timen=1
00:08:09.729 17:52:02 -- nvmf/run.sh@25 -- # local core=0x1
00:08:09.729 17:52:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:08:09.729 17:52:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf
00:08:09.729 17:52:02 -- nvmf/run.sh@29 -- # printf %02d 7
00:08:09.729 17:52:02 -- nvmf/run.sh@29 -- # port=4407
00:08:09.729 17:52:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:08:09.729 17:52:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407'
00:08:09.729 17:52:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:09.729 17:52:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock
00:08:09.729 [2024-11-19 17:52:02.439776] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
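The ###### Recommended dictionary ###### block that closes run 6 above is also standard libFuzzer output: tokens the mutator found productive, printed as C-escaped strings. The single entry here is seven zero bytes followed by octal \200 (0x80), the token credited to the CMP mutation at event #38. Tokens like this can be replayed through libFuzzer's standard -dict= flag; whether the llvm_nvme_fuzz wrapper forwards extra libFuzzer flags is an assumption, so the sketch below uses a placeholder target name:

# Save the token in libFuzzer dictionary syntax (hex escapes), then pass
# it back on a later run. ./fuzz_target and nvmf_7.dict are placeholders.
cat > nvmf_7.dict <<'EOF'
kw1="\x00\x00\x00\x00\x00\x00\x00\x80"
EOF
./fuzz_target -dict=nvmf_7.dict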
00:08:09.729 [2024-11-19 17:52:02.439838] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637077 ]
00:08:09.989 EAL: No free 2048 kB hugepages reported on node 1
00:08:09.989 [2024-11-19 17:52:02.691886] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:09.989 [2024-11-19 17:52:02.720104] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:09.989 [2024-11-19 17:52:02.720232] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:09.989 [2024-11-19 17:52:02.771484] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:09.989 [2024-11-19 17:52:02.787815] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 ***
00:08:09.989 INFO: Running with entropic power schedule (0xFF, 100).
00:08:09.989 INFO: Seed: 2034424727
00:08:09.989 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:09.989 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:09.989 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:08:09.989 INFO: A corpus is not provided, starting from an empty corpus
00:08:09.989 #2 INITED exec/s: 0 rss: 59Mb
00:08:09.989 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:09.989 This may also happen if the target rejected all inputs we tried so far
00:08:09.989 [2024-11-19 17:52:02.843549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000
00:08:09.989 [2024-11-19 17:52:02.843579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:09.989 [2024-11-19 17:52:02.843640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:08:09.989 [2024-11-19 17:52:02.843655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:09.989 [2024-11-19 17:52:02.843707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000
00:08:09.989 [2024-11-19 17:52:02.843721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:09.989 [2024-11-19 17:52:02.843772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000
00:08:09.989 [2024-11-19 17:52:02.843786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:09.989 [2024-11-19 17:52:02.843839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000
00:08:09.989 [2024-11-19 17:52:02.843853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:08:10.509 NEW_FUNC[1/665]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172
00:08:10.509 NEW_FUNC[2/665]: 0x48da48 in TestOneInput
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.509 #3 NEW cov: 11498 ft: 11527 corp: 2/11b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:10.509 [2024-11-19 17:52:03.134089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.134123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.134172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.134186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.134236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.134250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.134299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.134312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.134361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.134374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.509 NEW_FUNC[1/4]: 0x1c716e8 in spdk_thread_is_exited /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:728 00:08:10.509 NEW_FUNC[2/4]: 0x1c72478 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:797 00:08:10.509 #4 NEW cov: 11646 ft: 11973 corp: 3/21b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ChangeBit- 00:08:10.509 [2024-11-19 17:52:03.184044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.184071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.184122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.184139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.184185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.184197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.184242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.184254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.509 #7 NEW cov: 11652 ft: 12319 corp: 4/29b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 3 CopyPart-CrossOver-CrossOver- 00:08:10.509 [2024-11-19 17:52:03.224106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.224131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.224179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.224192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.224237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.224251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.224298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.224310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.509 #8 NEW cov: 11737 ft: 12581 corp: 5/37b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:10.509 [2024-11-19 17:52:03.264343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.264368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.264418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.264432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.264482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.264495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.264543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.264556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.264623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.264637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.509 #9 NEW cov: 11737 ft: 12633 corp: 6/47b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ChangeBit- 00:08:10.509 [2024-11-19 17:52:03.304360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.304385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.304437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.304451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.304499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.304512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.304558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.304571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.509 #10 NEW cov: 11737 ft: 12737 corp: 7/55b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 ShuffleBytes- 00:08:10.509 [2024-11-19 17:52:03.344589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.344619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.344696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.344710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.344759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.344772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.344819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.344832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.509 [2024-11-19 17:52:03.344881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.509 [2024-11-19 17:52:03.344894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.509 #11 NEW cov: 11737 ft: 12818 corp: 8/65b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:08:10.769 [2024-11-19 17:52:03.384290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.384315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.769 #12 NEW cov: 11737 ft: 13276 corp: 9/67b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 CopyPart- 00:08:10.769 [2024-11-19 17:52:03.424711] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.424736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.769 [2024-11-19 17:52:03.424785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.424799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.769 [2024-11-19 17:52:03.424845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.424859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.769 [2024-11-19 17:52:03.424908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.424921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.769 #13 NEW cov: 11737 ft: 13340 corp: 10/75b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 ChangeBit- 00:08:10.769 [2024-11-19 17:52:03.464708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.464733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.769 [2024-11-19 17:52:03.464801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.464815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.769 [2024-11-19 17:52:03.464865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.769 [2024-11-19 17:52:03.464878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.769 #14 NEW cov: 11737 ft: 13567 corp: 11/81b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 EraseBytes- 00:08:10.770 [2024-11-19 17:52:03.504667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.504691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.770 #15 NEW cov: 11737 ft: 13610 corp: 12/83b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:10.770 [2024-11-19 17:52:03.545183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.545208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.770 [2024-11-19 17:52:03.545255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.545268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.770 [2024-11-19 17:52:03.545315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.545328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.770 [2024-11-19 17:52:03.545375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.545388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.770 [2024-11-19 17:52:03.545433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.545446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.770 #16 NEW cov: 11737 ft: 13664 corp: 13/93b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:08:10.770 [2024-11-19 17:52:03.584870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.584896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.770 #17 NEW cov: 11737 ft: 13684 corp: 14/95b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:10.770 [2024-11-19 17:52:03.624976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:08:10.770 [2024-11-19 17:52:03.625001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.030 #18 NEW cov: 11737 ft: 13697 corp: 15/97b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeBit- 00:08:11.030 [2024-11-19 17:52:03.665456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.665481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.030 [2024-11-19 17:52:03.665527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.665541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.030 [2024-11-19 17:52:03.665587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.665605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.030 [2024-11-19 17:52:03.665657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002c00 cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.665671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.030 #19 NEW cov: 11737 ft: 13725 corp: 16/106b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 InsertByte- 00:08:11.030 [2024-11-19 17:52:03.705582] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.705614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.030 [2024-11-19 17:52:03.705680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.705694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.030 [2024-11-19 17:52:03.705744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.030 [2024-11-19 17:52:03.705757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.030 [2024-11-19 17:52:03.705805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.705818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.031 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.031 #20 NEW cov: 11760 ft: 13775 corp: 17/114b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 ShuffleBytes- 00:08:11.031 [2024-11-19 17:52:03.745580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.745611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.745662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.745676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.745723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.745737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.031 #21 NEW cov: 11760 ft: 13786 corp: 18/120b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 CrossOver- 00:08:11.031 [2024-11-19 17:52:03.785913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.785941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.785992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.786005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.786055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.786068] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.786118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.786131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.786179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.786193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.031 #22 NEW cov: 11760 ft: 13805 corp: 19/130b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:08:11.031 [2024-11-19 17:52:03.825798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.825824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.825885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.825899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.825947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.825960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.031 #23 NEW cov: 11760 ft: 13889 corp: 20/137b lim: 10 exec/s: 23 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:08:11.031 [2024-11-19 17:52:03.866017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.866043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.866107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.866121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.866170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.866183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.031 [2024-11-19 17:52:03.866232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.031 [2024-11-19 17:52:03.866256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.031 #24 NEW cov: 11760 ft: 13897 corp: 21/145b lim: 10 exec/s: 24 rss: 68Mb L: 8/10 MS: 1 ShuffleBytes- 00:08:11.291 [2024-11-19 17:52:03.905780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a41 cdw11:00000000 00:08:11.291 [2024-11-19 17:52:03.905809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.291 #25 NEW cov: 11760 ft: 13957 corp: 22/147b lim: 10 exec/s: 25 rss: 68Mb L: 2/10 MS: 1 ChangeByte- 00:08:11.291 [2024-11-19 17:52:03.946324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.291 [2024-11-19 17:52:03.946349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.291 [2024-11-19 17:52:03.946395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:11.291 [2024-11-19 17:52:03.946407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.291 [2024-11-19 17:52:03.946452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.291 [2024-11-19 17:52:03.946465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.291 [2024-11-19 17:52:03.946528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.291 [2024-11-19 17:52:03.946542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.291 [2024-11-19 17:52:03.946589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.291 [2024-11-19 17:52:03.946617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.291 #26 NEW cov: 11760 ft: 13970 corp: 23/157b lim: 10 exec/s: 26 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:08:11.292 [2024-11-19 17:52:03.986404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:03.986429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:03.986480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:03.986494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:03.986543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:03.986556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:03.986608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:03.986622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:03.986668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ 
(00) qid:0 cid:8 nsid:0 cdw10:00001000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:03.986682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.292 #27 NEW cov: 11760 ft: 13997 corp: 24/167b lim: 10 exec/s: 27 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:08:11.292 [2024-11-19 17:52:04.026317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.026342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.026409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.026422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.026475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.026489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.292 #28 NEW cov: 11760 ft: 14035 corp: 25/173b lim: 10 exec/s: 28 rss: 69Mb L: 6/10 MS: 1 ShuffleBytes- 00:08:11.292 [2024-11-19 17:52:04.066624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.066649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.066697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.066710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.066757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f6ff cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.066770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.066816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.066829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.066875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.066888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.292 #29 NEW cov: 11760 ft: 14039 corp: 26/183b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:11.292 [2024-11-19 17:52:04.106315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.106339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:11.292 #30 NEW cov: 11760 ft: 14063 corp: 27/185b lim: 10 exec/s: 30 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:11.292 [2024-11-19 17:52:04.146665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.146698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.146749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.146763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.292 [2024-11-19 17:52:04.146812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.292 [2024-11-19 17:52:04.146825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.552 #31 NEW cov: 11760 ft: 14125 corp: 28/192b lim: 10 exec/s: 31 rss: 69Mb L: 7/10 MS: 1 CrossOver- 00:08:11.552 [2024-11-19 17:52:04.186994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:08:11.552 [2024-11-19 17:52:04.187018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.552 [2024-11-19 17:52:04.187067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.187080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.187131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000000b4 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.187145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.187192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b4b4 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.187204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.187252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000b400 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.187265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.553 #32 NEW cov: 11760 ft: 14131 corp: 29/202b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:11.553 [2024-11-19 17:52:04.226992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.227016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.227083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.227097] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.227146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.227159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.227210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002800 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.227222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.553 #33 NEW cov: 11760 ft: 14151 corp: 30/210b lim: 10 exec/s: 33 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:08:11.553 [2024-11-19 17:52:04.267110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.267134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.267184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f6ff cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.267198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.267247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.267260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.267307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.267320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.553 #34 NEW cov: 11760 ft: 14160 corp: 31/218b lim: 10 exec/s: 34 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:08:11.553 [2024-11-19 17:52:04.307305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.307329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.307380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000010 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.307396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.307441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.307455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.307499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.307512] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.307560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.307573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.553 #35 NEW cov: 11760 ft: 14168 corp: 32/228b lim: 10 exec/s: 35 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:08:11.553 [2024-11-19 17:52:04.347002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.347026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.553 #36 NEW cov: 11760 ft: 14195 corp: 33/230b lim: 10 exec/s: 36 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:08:11.553 [2024-11-19 17:52:04.387443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.387467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.387529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.387543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.387593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.387610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.553 [2024-11-19 17:52:04.387659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.553 [2024-11-19 17:52:04.387672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.553 #42 NEW cov: 11760 ft: 14198 corp: 34/238b lim: 10 exec/s: 42 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:08:11.814 [2024-11-19 17:52:04.427670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.427695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.427761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fff5 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.427775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.427826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.427839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.427886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.427903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.427962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.427976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.814 #43 NEW cov: 11760 ft: 14224 corp: 35/248b lim: 10 exec/s: 43 rss: 69Mb L: 10/10 MS: 1 CMP- DE: "\377\377\377\365"- 00:08:11.814 [2024-11-19 17:52:04.467455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.467480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.467529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000010 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.467543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.814 #44 NEW cov: 11760 ft: 14361 corp: 36/253b lim: 10 exec/s: 44 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:08:11.814 [2024-11-19 17:52:04.507902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.507927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.507975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.507988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.814 [2024-11-19 17:52:04.508035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.814 [2024-11-19 17:52:04.508048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.508110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.508123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.508170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.508184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.815 #45 NEW cov: 11760 ft: 14364 corp: 37/263b lim: 10 exec/s: 45 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:08:11.815 [2024-11-19 17:52:04.547887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000bd cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.547911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.547959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f6ff cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.547972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.548021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.548035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.548083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.548096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.815 #46 NEW cov: 11760 ft: 14381 corp: 38/271b lim: 10 exec/s: 46 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:08:11.815 [2024-11-19 17:52:04.588131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.588155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.588205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.588218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.588266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.588279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.588326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.588338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.588385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000049 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.588398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.815 #47 NEW cov: 11760 ft: 14389 corp: 39/281b lim: 10 exec/s: 47 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:08:11.815 [2024-11-19 17:52:04.627940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.627964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.628028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.628041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:11.815 #51 NEW cov: 11760 ft: 14400 corp: 40/286b lim: 10 exec/s: 51 rss: 69Mb L: 5/10 MS: 4 EraseBytes-CrossOver-ShuffleBytes-CMP- DE: "\001\000\000\016"- 00:08:11.815 [2024-11-19 17:52:04.668131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.668155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.668206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.668219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.815 [2024-11-19 17:52:04.668270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.815 [2024-11-19 17:52:04.668283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.075 #52 NEW cov: 11760 ft: 14463 corp: 41/293b lim: 10 exec/s: 52 rss: 69Mb L: 7/10 MS: 1 CrossOver- 00:08:12.075 [2024-11-19 17:52:04.708045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000290a cdw11:00000000 00:08:12.075 [2024-11-19 17:52:04.708070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.075 #53 NEW cov: 11760 ft: 14471 corp: 42/296b lim: 10 exec/s: 53 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:08:12.075 [2024-11-19 17:52:04.738447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.075 [2024-11-19 17:52:04.738474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.075 [2024-11-19 17:52:04.738526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f62f cdw11:00000000 00:08:12.075 [2024-11-19 17:52:04.738539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.076 [2024-11-19 17:52:04.738591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.076 [2024-11-19 17:52:04.738608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.076 [2024-11-19 17:52:04.738657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.076 [2024-11-19 17:52:04.738669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.076 #54 NEW cov: 11760 ft: 14515 corp: 43/304b lim: 10 exec/s: 54 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:08:12.076 [2024-11-19 17:52:04.778675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:12.076 [2024-11-19 17:52:04.778699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.076 [2024-11-19 
17:52:04.778750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a0a cdw11:00000000
00:08:12.076 [2024-11-19 17:52:04.778764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:12.076 [2024-11-19 17:52:04.778813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000
00:08:12.076 [2024-11-19 17:52:04.778842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:12.076 [2024-11-19 17:52:04.778890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000
00:08:12.076 [2024-11-19 17:52:04.778903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:12.076 [2024-11-19 17:52:04.778951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000
00:08:12.076 [2024-11-19 17:52:04.778965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:08:12.076 #55 NEW cov: 11760 ft: 14524 corp: 44/314b lim: 10 exec/s: 55 rss: 70Mb L: 10/10 MS: 1 CrossOver-
00:08:12.076 [2024-11-19 17:52:04.818378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000
00:08:12.076 [2024-11-19 17:52:04.818402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:12.076 #56 NEW cov: 11760 ft: 14542 corp: 45/317b lim: 10 exec/s: 28 rss: 70Mb L: 3/10 MS: 1 EraseBytes-
00:08:12.076 #56 DONE cov: 11760 ft: 14542 corp: 45/317b lim: 10 exec/s: 28 rss: 70Mb
00:08:12.076 ###### Recommended dictionary. ######
00:08:12.076 "\377\377\377\365" # Uses: 0
00:08:12.076 "\001\000\000\016" # Uses: 0
00:08:12.076 ###### End of recommended dictionary. ######
00:08:12.076 Done 56 runs in 2 second(s)
00:08:12.336 17:52:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf
00:08:12.336 17:52:04 -- ../common.sh@72 -- # (( i++ ))
00:08:12.336 17:52:04 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:12.336 17:52:04 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:08:12.336 17:52:04 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:08:12.336 17:52:04 -- nvmf/run.sh@24 -- # local timen=1
00:08:12.336 17:52:04 -- nvmf/run.sh@25 -- # local core=0x1
00:08:12.336 17:52:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:08:12.336 17:52:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:08:12.336 17:52:04 -- nvmf/run.sh@29 -- # printf %02d 8
00:08:12.336 17:52:04 -- nvmf/run.sh@29 -- # port=4408
00:08:12.336 17:52:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:08:12.336 17:52:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:08:12.336 17:52:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:12.336 17:52:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock
00:08:12.596 [2024-11-19 17:52:04.998560] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:12.596 [2024-11-19 17:52:04.998640] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637462 ]
00:08:12.596 EAL: No free 2048 kB hugepages reported on node 1
00:08:12.596 [2024-11-19 17:52:05.255452] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:12.596 [2024-11-19 17:52:05.283176] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:12.596 [2024-11-19 17:52:05.283315] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:12.596 [2024-11-19 17:52:05.334635] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:12.596 [2024-11-19 17:52:05.350963] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 ***
00:08:12.596 INFO: Running with entropic power schedule (0xFF, 100).
00:08:12.596 INFO: Seed: 302437867 00:08:12.596 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:12.596 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:12.596 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:12.596 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.596 [2024-11-19 17:52:05.418158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.596 [2024-11-19 17:52:05.418199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.596 #2 INITED cov: 11561 ft: 11562 corp: 1/1b exec/s: 0 rss: 66Mb 00:08:12.857 [2024-11-19 17:52:05.468468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.468497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.857 #3 NEW cov: 11674 ft: 12193 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeBit- 00:08:12.857 [2024-11-19 17:52:05.530274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.530303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.857 [2024-11-19 17:52:05.530399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.530415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.857 [2024-11-19 17:52:05.530500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.530519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.857 [2024-11-19 17:52:05.530602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.530618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.857 #4 NEW cov: 11680 ft: 13200 corp: 3/6b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:12.857 [2024-11-19 17:52:05.589366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.589394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.857 #5 NEW cov: 11765 ft: 13489 corp: 4/7b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeByte- 00:08:12.857 [2024-11-19 17:52:05.639703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.639732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.857 #6 NEW cov: 11765 ft: 13601 corp: 5/8b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ShuffleBytes- 00:08:12.857 [2024-11-19 17:52:05.690028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.857 [2024-11-19 17:52:05.690055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.857 #7 NEW cov: 11765 ft: 13666 corp: 6/9b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeBinInt- 00:08:13.118 [2024-11-19 17:52:05.740794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.740822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.118 [2024-11-19 17:52:05.740901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.740916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.118 #8 NEW cov: 11765 ft: 13902 corp: 7/11b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 InsertByte- 00:08:13.118 [2024-11-19 17:52:05.800957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.800984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.118 [2024-11-19 17:52:05.801071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.801088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.118 #9 NEW cov: 11765 ft: 13955 corp: 8/13b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 InsertByte- 00:08:13.118 [2024-11-19 17:52:05.850728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.850754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.118 #10 NEW cov: 11765 ft: 13990 corp: 9/14b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeBinInt- 00:08:13.118 [2024-11-19 17:52:05.901396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.901428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.118 [2024-11-19 17:52:05.901512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.901528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.118 #11 NEW cov: 11765 ft: 14050 corp: 10/16b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeBit- 00:08:13.118 [2024-11-19 17:52:05.961940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.961969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.118 [2024-11-19 17:52:05.962059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.118 [2024-11-19 17:52:05.962077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.378 #12 NEW cov: 11765 ft: 14073 corp: 11/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeBinInt- 00:08:13.378 [2024-11-19 17:52:06.021787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.378 [2024-11-19 17:52:06.021815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.378 #13 NEW cov: 11765 ft: 14128 corp: 12/19b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeBinInt- 00:08:13.378 [2024-11-19 17:52:06.071882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.071910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.379 #14 NEW cov: 11765 ft: 14155 corp: 13/20b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 CopyPart- 00:08:13.379 [2024-11-19 17:52:06.134006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.134033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.134128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.134145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.134233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.134247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.134324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.379 [2024-11-19 17:52:06.134340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.134425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.134440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.379 #15 NEW cov: 11765 ft: 14287 corp: 14/25b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:13.379 [2024-11-19 17:52:06.182523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.182549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.379 #16 NEW cov: 11765 ft: 14383 corp: 15/26b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:08:13.379 [2024-11-19 17:52:06.234692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.234720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.234815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.234832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.234910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.234925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.235007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.235023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.379 [2024-11-19 17:52:06.235104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.379 [2024-11-19 17:52:06.235119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.639 #17 NEW cov: 11765 ft: 14428 corp: 16/31b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:13.639 [2024-11-19 17:52:06.294559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.639 [2024-11-19 17:52:06.294586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.639 [2024-11-19 
17:52:06.294675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.639 [2024-11-19 17:52:06.294693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.639 [2024-11-19 17:52:06.294775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.639 [2024-11-19 17:52:06.294791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.639 [2024-11-19 17:52:06.294873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.639 [2024-11-19 17:52:06.294888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.899 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.899 #18 NEW cov: 11788 ft: 14494 corp: 17/35b lim: 5 exec/s: 18 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:08:13.899 [2024-11-19 17:52:06.603644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.603679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.899 [2024-11-19 17:52:06.603820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.603838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.899 #19 NEW cov: 11788 ft: 14676 corp: 18/37b lim: 5 exec/s: 19 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:08:13.899 [2024-11-19 17:52:06.653448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.653478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.899 #20 NEW cov: 11788 ft: 14724 corp: 19/38b lim: 5 exec/s: 20 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:08:13.899 [2024-11-19 17:52:06.704593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.704625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.899 [2024-11-19 17:52:06.704756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.704774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.899 [2024-11-19 17:52:06.704912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.704930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.899 [2024-11-19 17:52:06.705063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.899 [2024-11-19 17:52:06.705083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.899 #21 NEW cov: 11788 ft: 14736 corp: 20/42b lim: 5 exec/s: 21 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:14.161 [2024-11-19 17:52:06.764173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.764201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.764328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.764348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.161 #22 NEW cov: 11788 ft: 14756 corp: 21/44b lim: 5 exec/s: 22 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:14.161 [2024-11-19 17:52:06.814265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.814293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.814436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.814456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.161 #23 NEW cov: 11788 ft: 14825 corp: 22/46b lim: 5 exec/s: 23 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:14.161 [2024-11-19 17:52:06.875078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.875107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.875244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.875263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.875410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.875430] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.875574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.875594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.161 #24 NEW cov: 11788 ft: 14827 corp: 23/50b lim: 5 exec/s: 24 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:14.161 [2024-11-19 17:52:06.935603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.935631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.935769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.935788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.935926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.935945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.936089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.936108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.936250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.936269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.161 #25 NEW cov: 11788 ft: 14851 corp: 24/55b lim: 5 exec/s: 25 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:14.161 [2024-11-19 17:52:06.995864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.995892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.996036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.996057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.996189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.996210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.996355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.996375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.161 [2024-11-19 17:52:06.996519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.161 [2024-11-19 17:52:06.996538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.422 #26 NEW cov: 11788 ft: 14865 corp: 25/60b lim: 5 exec/s: 26 rss: 69Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:14.422 [2024-11-19 17:52:07.056040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.056068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.422 [2024-11-19 17:52:07.056185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.056203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.422 [2024-11-19 17:52:07.056343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.056361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.422 [2024-11-19 17:52:07.056507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.056524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.422 [2024-11-19 17:52:07.056677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.056693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.422 #27 NEW cov: 11788 ft: 14880 corp: 26/65b lim: 5 exec/s: 27 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:08:14.422 [2024-11-19 17:52:07.105524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.105552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.422 [2024-11-19 17:52:07.105690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) 
qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.105709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.422 [2024-11-19 17:52:07.105837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.105859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.422 #28 NEW cov: 11788 ft: 15044 corp: 27/68b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:08:14.422 [2024-11-19 17:52:07.166471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.422 [2024-11-19 17:52:07.166499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.166628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.166645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.166771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.166788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.166931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.166949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.167083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.167102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.423 #29 NEW cov: 11788 ft: 15055 corp: 28/73b lim: 5 exec/s: 29 rss: 69Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:14.423 [2024-11-19 17:52:07.226256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.226284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.226426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.226444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.226579] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.226596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.423 [2024-11-19 17:52:07.226734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.226753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.423 #30 NEW cov: 11788 ft: 15058 corp: 29/77b lim: 5 exec/s: 30 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:08:14.423 [2024-11-19 17:52:07.275499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.423 [2024-11-19 17:52:07.275525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.684 #31 NEW cov: 11788 ft: 15064 corp: 30/78b lim: 5 exec/s: 31 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:08:14.684 [2024-11-19 17:52:07.325948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.684 [2024-11-19 17:52:07.325975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.684 [2024-11-19 17:52:07.326118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.684 [2024-11-19 17:52:07.326135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.684 #32 NEW cov: 11788 ft: 15068 corp: 31/80b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:08:14.684 [2024-11-19 17:52:07.376186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.684 [2024-11-19 17:52:07.376213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.684 [2024-11-19 17:52:07.376346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.684 [2024-11-19 17:52:07.376366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.684 #33 NEW cov: 11788 ft: 15082 corp: 32/82b lim: 5 exec/s: 16 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:08:14.684 #33 DONE cov: 11788 ft: 15082 corp: 32/82b lim: 5 exec/s: 16 rss: 70Mb 00:08:14.684 Done 33 runs in 2 second(s) 00:08:14.684 17:52:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:08:14.684 17:52:07 -- ../common.sh@72 -- # (( i++ )) 00:08:14.684 17:52:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.684 17:52:07 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:14.684 17:52:07 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:14.684 17:52:07 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.684 17:52:07 
-- nvmf/run.sh@25 -- # local core=0x1 00:08:14.684 17:52:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.684 17:52:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:14.684 17:52:07 -- nvmf/run.sh@29 -- # printf %02d 9 00:08:14.684 17:52:07 -- nvmf/run.sh@29 -- # port=4409 00:08:14.684 17:52:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.684 17:52:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:14.684 17:52:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.684 17:52:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:08:14.944 [2024-11-19 17:52:07.559565] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:14.945 [2024-11-19 17:52:07.559644] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637999 ] 00:08:14.945 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.206 [2024-11-19 17:52:07.813284] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.206 [2024-11-19 17:52:07.839874] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.206 [2024-11-19 17:52:07.839990] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.206 [2024-11-19 17:52:07.891256] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.206 [2024-11-19 17:52:07.907578] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:15.206 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:15.206 INFO: Seed: 2857456751 00:08:15.206 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:15.206 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:15.206 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:15.206 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.206 [2024-11-19 17:52:07.952774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.206 [2024-11-19 17:52:07.952801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.206 #2 INITED cov: 11561 ft: 11562 corp: 1/1b exec/s: 0 rss: 66Mb 00:08:15.206 [2024-11-19 17:52:07.982740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.206 [2024-11-19 17:52:07.982765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.206 #3 NEW cov: 11674 ft: 12155 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 CrossOver- 00:08:15.206 [2024-11-19 17:52:08.022853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.206 [2024-11-19 17:52:08.022878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.206 #4 NEW cov: 11680 ft: 12349 corp: 3/3b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeBit- 00:08:15.206 [2024-11-19 17:52:08.053067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.206 [2024-11-19 17:52:08.053091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.206 [2024-11-19 17:52:08.053160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.206 [2024-11-19 17:52:08.053174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.467 #5 NEW cov: 11765 ft: 13227 corp: 4/5b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:08:15.467 [2024-11-19 17:52:08.093145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.093169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.093239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.093252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.467 #6 NEW cov: 11765 ft: 13306 corp: 5/7b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeByte- 00:08:15.467 
[2024-11-19 17:52:08.133292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.133317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.133370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.133384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.467 #7 NEW cov: 11765 ft: 13471 corp: 6/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:08:15.467 [2024-11-19 17:52:08.173289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.173316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.467 #8 NEW cov: 11765 ft: 13549 corp: 7/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeBinInt- 00:08:15.467 [2024-11-19 17:52:08.213716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.213741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.213797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.213811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.213867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.213880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.467 #9 NEW cov: 11765 ft: 13900 corp: 8/13b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 CrossOver- 00:08:15.467 [2024-11-19 17:52:08.253559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.253587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.467 #10 NEW cov: 11765 ft: 14002 corp: 9/14b lim: 5 exec/s: 0 rss: 66Mb L: 1/3 MS: 1 ChangeByte- 00:08:15.467 [2024-11-19 17:52:08.294046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.294072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.294127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.294140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.294192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.294205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.467 [2024-11-19 17:52:08.294257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.467 [2024-11-19 17:52:08.294270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.467 #11 NEW cov: 11765 ft: 14330 corp: 10/18b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:15.728 [2024-11-19 17:52:08.333899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.333925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.333980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.333997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.728 #12 NEW cov: 11765 ft: 14345 corp: 11/20b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 CopyPart- 00:08:15.728 [2024-11-19 17:52:08.373821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.373846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.728 #13 NEW cov: 11765 ft: 14381 corp: 12/21b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 ChangeByte- 00:08:15.728 [2024-11-19 17:52:08.414548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.414572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.414648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.414663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.414717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.414730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.414782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.414796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.414852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.414865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.728 #14 NEW cov: 11765 ft: 14490 corp: 13/26b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:08:15.728 [2024-11-19 17:52:08.454216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.454240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.454307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.454321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.728 #15 NEW cov: 11765 ft: 14520 corp: 14/28b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 EraseBytes- 00:08:15.728 [2024-11-19 17:52:08.494471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.494496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.494569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.494583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.728 [2024-11-19 17:52:08.494645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.494658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.728 #16 NEW cov: 11765 ft: 14538 corp: 15/31b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 CopyPart- 00:08:15.728 [2024-11-19 17:52:08.534374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.534398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.728 #17 NEW cov: 11765 ft: 14561 corp: 16/32b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 EraseBytes- 00:08:15.728 [2024-11-19 17:52:08.574405] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.728 [2024-11-19 17:52:08.574429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 #18 NEW cov: 11765 ft: 14572 corp: 17/33b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeBit- 00:08:15.989 [2024-11-19 17:52:08.614685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.614709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.614776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.614790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.989 #19 NEW cov: 11765 ft: 14587 corp: 18/35b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeByte- 00:08:15.989 [2024-11-19 17:52:08.654682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.654706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 #20 NEW cov: 11765 ft: 14670 corp: 19/36b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeByte- 00:08:15.989 [2024-11-19 17:52:08.694751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.694776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 #21 NEW cov: 11765 ft: 14733 corp: 20/37b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:08:15.989 [2024-11-19 17:52:08.725056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.725082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.725137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.725151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.989 #22 NEW cov: 11765 ft: 14740 corp: 21/39b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeBit- 00:08:15.989 [2024-11-19 17:52:08.765457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.765484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 
17:52:08.765554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.765568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.765625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.765638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.765691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.765704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.989 #23 NEW cov: 11765 ft: 14798 corp: 22/43b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 ChangeBit- 00:08:15.989 [2024-11-19 17:52:08.805590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.805621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.805703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.805723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.805782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.805796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.805848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.805861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.989 #24 NEW cov: 11765 ft: 14810 corp: 23/47b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:15.989 [2024-11-19 17:52:08.845408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.845432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.989 [2024-11-19 17:52:08.845500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.989 [2024-11-19 17:52:08.845513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.524 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.524 #25 NEW cov: 11788 ft: 14922 corp: 24/49b lim: 5 exec/s: 25 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:08:16.524 [2024-11-19 17:52:09.136417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.136447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.136519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.136532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.136581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.136594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.136650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.136663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.524 #26 NEW cov: 11788 ft: 14992 corp: 25/53b lim: 5 exec/s: 26 rss: 68Mb L: 4/5 MS: 1 CMP- DE: "\001\000\000\010"- 00:08:16.524 [2024-11-19 17:52:09.176162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.176187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.176255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.176269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.524 #27 NEW cov: 11788 ft: 14995 corp: 26/55b lim: 5 exec/s: 27 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:08:16.524 [2024-11-19 17:52:09.216263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.216288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.216353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.216366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.524 #28 NEW cov: 
11788 ft: 15030 corp: 27/57b lim: 5 exec/s: 28 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:08:16.524 [2024-11-19 17:52:09.256323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.256347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 #29 NEW cov: 11788 ft: 15044 corp: 28/58b lim: 5 exec/s: 29 rss: 68Mb L: 1/5 MS: 1 CopyPart- 00:08:16.524 [2024-11-19 17:52:09.286899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.286923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.286991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.287004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.287054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.287070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.287119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.287132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.287182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.287194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.524 #30 NEW cov: 11788 ft: 15057 corp: 29/63b lim: 5 exec/s: 30 rss: 68Mb L: 5/5 MS: 1 CopyPart- 00:08:16.524 [2024-11-19 17:52:09.326763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.326787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.326856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.326870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.326923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 
17:52:09.326936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.524 #31 NEW cov: 11788 ft: 15072 corp: 30/66b lim: 5 exec/s: 31 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:08:16.524 [2024-11-19 17:52:09.366727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.366751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.524 [2024-11-19 17:52:09.366820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.524 [2024-11-19 17:52:09.366833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.784 #32 NEW cov: 11788 ft: 15083 corp: 31/68b lim: 5 exec/s: 32 rss: 68Mb L: 2/5 MS: 1 ChangeBinInt- 00:08:16.784 [2024-11-19 17:52:09.407310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.407335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.407388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.407403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.407455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.407485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.407535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.407552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.407607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.407622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.784 #33 NEW cov: 11788 ft: 15089 corp: 32/73b lim: 5 exec/s: 33 rss: 68Mb L: 5/5 MS: 1 PersAutoDict- DE: "\001\000\000\010"- 00:08:16.784 [2024-11-19 17:52:09.446822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.446846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.784 #34 NEW cov: 11788 ft: 
15104 corp: 33/74b lim: 5 exec/s: 34 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:08:16.784 [2024-11-19 17:52:09.487025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.487049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.784 #35 NEW cov: 11788 ft: 15118 corp: 34/75b lim: 5 exec/s: 35 rss: 69Mb L: 1/5 MS: 1 ChangeByte- 00:08:16.784 [2024-11-19 17:52:09.517444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.517468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.517536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.517550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.517606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.517620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.784 [2024-11-19 17:52:09.517671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.784 [2024-11-19 17:52:09.517684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.785 #36 NEW cov: 11788 ft: 15119 corp: 35/79b lim: 5 exec/s: 36 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:08:16.785 [2024-11-19 17:52:09.557592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.557621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.785 [2024-11-19 17:52:09.557691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.557705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.785 [2024-11-19 17:52:09.557758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.557771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.785 [2024-11-19 17:52:09.557830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 
17:52:09.557843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.785 #37 NEW cov: 11788 ft: 15134 corp: 36/83b lim: 5 exec/s: 37 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:16.785 [2024-11-19 17:52:09.597399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.597423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.785 [2024-11-19 17:52:09.597490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.597504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.785 #38 NEW cov: 11788 ft: 15147 corp: 37/85b lim: 5 exec/s: 38 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:16.785 [2024-11-19 17:52:09.637510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.637533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.785 [2024-11-19 17:52:09.637608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.785 [2024-11-19 17:52:09.637622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.045 #39 NEW cov: 11788 ft: 15153 corp: 38/87b lim: 5 exec/s: 39 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:08:17.045 [2024-11-19 17:52:09.677501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.677525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.045 #40 NEW cov: 11788 ft: 15156 corp: 39/88b lim: 5 exec/s: 40 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:08:17.045 [2024-11-19 17:52:09.707560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.707584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.045 #41 NEW cov: 11788 ft: 15170 corp: 40/89b lim: 5 exec/s: 41 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:08:17.045 [2024-11-19 17:52:09.748159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.748184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.748252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 
[2024-11-19 17:52:09.748267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.748320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.748332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.748387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.748401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.045 #42 NEW cov: 11788 ft: 15185 corp: 41/93b lim: 5 exec/s: 42 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:08:17.045 [2024-11-19 17:52:09.787793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.787819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.045 #43 NEW cov: 11788 ft: 15204 corp: 42/94b lim: 5 exec/s: 43 rss: 69Mb L: 1/5 MS: 1 ChangeByte- 00:08:17.045 [2024-11-19 17:52:09.828479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.828505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.828572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.828586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.828645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.828659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.828709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.828721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.828771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.828784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.045 #44 NEW cov: 11788 ft: 15210 corp: 43/99b lim: 5 exec/s: 44 rss: 69Mb L: 5/5 MS: 1 InsertByte- 00:08:17.045 [2024-11-19 17:52:09.868495] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.868519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.868588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.868609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.868664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.868678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.045 [2024-11-19 17:52:09.868729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.045 [2024-11-19 17:52:09.868741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.908613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.908646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.908716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.908730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.908783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.908796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.908850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.908864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.305 #46 NEW cov: 11788 ft: 15215 corp: 44/103b lim: 5 exec/s: 46 rss: 69Mb L: 4/5 MS: 2 CrossOver-ShuffleBytes- 00:08:17.305 [2024-11-19 17:52:09.948744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.948768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.948836] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.948849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.948900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.948913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.305 [2024-11-19 17:52:09.948964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.305 [2024-11-19 17:52:09.948977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.305 #47 NEW cov: 11788 ft: 15233 corp: 45/107b lim: 5 exec/s: 23 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:08:17.305 #47 DONE cov: 11788 ft: 15233 corp: 45/107b lim: 5 exec/s: 23 rss: 69Mb 00:08:17.305 ###### Recommended dictionary. ###### 00:08:17.305 "\001\000\000\010" # Uses: 1 00:08:17.305 ###### End of recommended dictionary. ###### 00:08:17.305 Done 47 runs in 2 second(s) 00:08:17.305 17:52:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:08:17.305 17:52:10 -- ../common.sh@72 -- # (( i++ )) 00:08:17.305 17:52:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.305 17:52:10 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:17.305 17:52:10 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:17.305 17:52:10 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.305 17:52:10 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.305 17:52:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:17.305 17:52:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:17.305 17:52:10 -- nvmf/run.sh@29 -- # printf %02d 10 00:08:17.306 17:52:10 -- nvmf/run.sh@29 -- # port=4410 00:08:17.306 17:52:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:17.306 17:52:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:17.306 17:52:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.306 17:52:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:08:17.306 [2024-11-19 17:52:10.131141] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
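The nvmf/run.sh trace above shows how each fuzzer run is parameterized: the two-digit fuzzer index is appended to a "44" port prefix (printf %02d 10 -> port=4410), a per-run JSON config is derived from fuzz_json.conf by rewriting trsvcid, and llvm_nvme_fuzz is pointed at a per-index corpus directory and RPC socket. A minimal shell sketch of that setup, assuming a rootdir variable for the spdk checkout and assuming sed's output is redirected into the per-run config (the redirection target is not visible in the trace), follows; this is a reconstruction from the logged commands, not verbatim run.sh contents:

# sketch of one start_llvm_fuzz iteration, inferred from the trace above
fuzzer_type=10                                  # first positional arg in the trace
timen=1                                         # run duration in the -t flag
core=0x1                                        # core mask in the -m flag
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # assumed
corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
port=44$(printf %02d $fuzzer_type)              # 10 -> 4410, 11 -> 4411
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
mkdir -p $corpus_dir
# retarget the template JSON config at this run's port (redirection assumed)
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
    -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
    -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk$fuzzer_type.sock

With fuzzer_type=11 the same sketch yields the parameters of the next run in this log (port 4411, /tmp/fuzz_json_11.conf, llvm_nvmf_11, /var/tmp/spdk11.sock), matching the trace that follows.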
00:08:17.306 [2024-11-19 17:52:10.131210] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638460 ] 00:08:17.306 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.565 [2024-11-19 17:52:10.380892] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.565 [2024-11-19 17:52:10.407865] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.565 [2024-11-19 17:52:10.408004] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.825 [2024-11-19 17:52:10.459518] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:17.825 [2024-11-19 17:52:10.475830] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:17.825 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.825 INFO: Seed: 1131498337 00:08:17.825 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:17.825 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:17.825 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:17.825 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.825 #2 INITED exec/s: 0 rss: 59Mb 00:08:17.825 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:17.825 This may also happen if the target rejected all inputs we tried so far 00:08:17.825 [2024-11-19 17:52:10.521097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.825 [2024-11-19 17:52:10.521128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.085 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:18.085 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.085 #3 NEW cov: 11584 ft: 11582 corp: 2/16b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:08:18.085 [2024-11-19 17:52:10.821940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.085 [2024-11-19 17:52:10.821974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.085 [2024-11-19 17:52:10.822045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.085 [2024-11-19 17:52:10.822068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.085 #8 NEW cov: 11697 ft: 12430 corp: 3/32b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 5 CrossOver-ChangeBit-ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:08:18.085 [2024-11-19 17:52:10.872004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.085 [2024-11-19 17:52:10.872034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.085 [2024-11-19 17:52:10.872094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.085 [2024-11-19 17:52:10.872108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.085 #9 NEW cov: 11703 ft: 12627 corp: 4/48b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ChangeByte- 00:08:18.085 [2024-11-19 17:52:10.912072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.085 [2024-11-19 17:52:10.912097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.085 [2024-11-19 17:52:10.912159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d240000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.085 [2024-11-19 17:52:10.912172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.085 #10 NEW cov: 11788 ft: 12910 corp: 5/64b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ChangeByte- 00:08:18.346 [2024-11-19 17:52:10.952477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.952503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:10.952581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.952595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:10.952662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.952674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:10.952735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4240000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.952749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.346 #11 NEW cov: 11788 ft: 13499 corp: 6/100b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:18.346 [2024-11-19 17:52:10.992631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.992656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:10.992745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d3d3d3d cdw11:003db4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.992760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:10.992812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.992825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:10.992886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:10.992902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.346 #12 NEW cov: 11788 ft: 13612 corp: 7/137b lim: 40 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 CrossOver- 00:08:18.346 [2024-11-19 17:52:11.032425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00003d00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:11.032450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:11.032513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:11.032527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.346 #13 NEW cov: 11788 ft: 13763 corp: 8/153b lim: 40 exec/s: 0 rss: 67Mb L: 16/37 MS: 1 CopyPart- 00:08:18.346 [2024-11-19 17:52:11.072568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:11.072594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:11.072677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d240000 cdw11:00230000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:11.072691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.346 #14 NEW cov: 11788 ft: 13846 corp: 9/169b lim: 40 exec/s: 0 rss: 67Mb L: 16/37 MS: 1 ChangeByte- 00:08:18.346 [2024-11-19 17:52:11.112919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00240000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.346 [2024-11-19 17:52:11.112944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.346 [2024-11-19 17:52:11.113003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:18.347 [2024-11-19 17:52:11.113017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.347 [2024-11-19 17:52:11.113074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.113088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.347 [2024-11-19 17:52:11.113143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4240000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.113156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.347 #15 NEW cov: 11788 ft: 13873 corp: 10/205b lim: 40 exec/s: 0 rss: 67Mb L: 36/37 MS: 1 ChangeBinInt- 00:08:18.347 [2024-11-19 17:52:11.152782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b352a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.152808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.347 [2024-11-19 17:52:11.152882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.152896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.347 #16 NEW cov: 11788 ft: 13959 corp: 11/222b lim: 40 exec/s: 0 rss: 67Mb L: 17/37 MS: 1 InsertByte- 00:08:18.347 [2024-11-19 17:52:11.193177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00240000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.193202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.347 [2024-11-19 17:52:11.193276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.193291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.347 [2024-11-19 17:52:11.193348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.193362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.347 [2024-11-19 17:52:11.193419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4240000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.347 [2024-11-19 17:52:11.193433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.608 #17 NEW cov: 11788 ft: 13994 corp: 12/261b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:18.608 [2024-11-19 
17:52:11.233040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.233066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.233128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d240000 cdw11:00030000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.233142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.608 #18 NEW cov: 11788 ft: 14008 corp: 13/277b lim: 40 exec/s: 0 rss: 68Mb L: 16/39 MS: 1 ChangeBit- 00:08:18.608 [2024-11-19 17:52:11.273147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5b3500 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.273174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.273236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.273249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.608 #19 NEW cov: 11788 ft: 14023 corp: 14/294b lim: 40 exec/s: 0 rss: 68Mb L: 17/39 MS: 1 CrossOver- 00:08:18.608 [2024-11-19 17:52:11.313195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fe5b3500 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.313221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.313283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:003d2400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.313297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.608 #20 NEW cov: 11788 ft: 14055 corp: 15/311b lim: 40 exec/s: 0 rss: 68Mb L: 17/39 MS: 1 InsertByte- 00:08:18.608 [2024-11-19 17:52:11.353586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00240000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.353620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.353684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.353699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.353758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 
17:52:11.353772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.353835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4240000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.353848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.608 #26 NEW cov: 11788 ft: 14063 corp: 16/350b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CrossOver- 00:08:18.608 [2024-11-19 17:52:11.393510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.393535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.393616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2a3d2400 cdw11:00002300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.393630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.608 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.608 #27 NEW cov: 11811 ft: 14094 corp: 17/367b lim: 40 exec/s: 0 rss: 68Mb L: 17/39 MS: 1 InsertByte- 00:08:18.608 [2024-11-19 17:52:11.433579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b355b35 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.433610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.608 [2024-11-19 17:52:11.433675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.608 [2024-11-19 17:52:11.433689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.608 #28 NEW cov: 11811 ft: 14172 corp: 18/384b lim: 40 exec/s: 0 rss: 68Mb L: 17/39 MS: 1 CrossOver- 00:08:18.868 [2024-11-19 17:52:11.473682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.868 [2024-11-19 17:52:11.473707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.868 [2024-11-19 17:52:11.473784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ac3dbff cdw11:fc002300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.868 [2024-11-19 17:52:11.473799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.868 #29 NEW cov: 11811 ft: 14202 corp: 19/401b lim: 40 exec/s: 0 rss: 68Mb L: 17/39 MS: 1 ChangeBinInt- 00:08:18.868 [2024-11-19 17:52:11.513814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:18.869 [2024-11-19 17:52:11.513842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.513915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b4240000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.513929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.869 #30 NEW cov: 11811 ft: 14219 corp: 20/417b lim: 40 exec/s: 30 rss: 68Mb L: 16/39 MS: 1 CrossOver- 00:08:18.869 [2024-11-19 17:52:11.554181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.554208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.554267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.554280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.554337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.554351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.554408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4d2b4b4 cdw11:b4240000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.554421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.869 #31 NEW cov: 11811 ft: 14230 corp: 21/453b lim: 40 exec/s: 31 rss: 68Mb L: 36/39 MS: 1 ChangeByte- 00:08:18.869 [2024-11-19 17:52:11.594180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.594205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.594265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ac3dbff cdw11:fc002300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.594279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.594338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00002ac3 cdw11:dbfffc00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.594351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.869 #32 NEW cov: 11811 ft: 14419 corp: 22/480b lim: 40 exec/s: 32 rss: 68Mb L: 27/39 MS: 1 CopyPart- 00:08:18.869 [2024-11-19 17:52:11.634288] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5bff8b cdw11:6c5dfc2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.634314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.634374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f663500 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.634388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.634462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.634479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.869 #33 NEW cov: 11811 ft: 14492 corp: 23/505b lim: 40 exec/s: 33 rss: 68Mb L: 25/39 MS: 1 CMP- DE: "\377\213l]\374,/f"- 00:08:18.869 [2024-11-19 17:52:11.674202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0024002d cdw11:00280000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.674227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.869 #37 NEW cov: 11811 ft: 14509 corp: 24/519b lim: 40 exec/s: 37 rss: 68Mb L: 14/39 MS: 4 CrossOver-ChangeByte-ChangeByte-CopyPart- 00:08:18.869 [2024-11-19 17:52:11.714399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.714425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.869 [2024-11-19 17:52:11.714487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2a3d2400 cdw11:00002300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.869 [2024-11-19 17:52:11.714500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.130 #38 NEW cov: 11811 ft: 14523 corp: 25/536b lim: 40 exec/s: 38 rss: 68Mb L: 17/39 MS: 1 ShuffleBytes- 00:08:19.130 [2024-11-19 17:52:11.754483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.754509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.754570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d242300 cdw11:00002300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.754584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.130 #39 NEW cov: 11811 ft: 14536 corp: 26/553b lim: 40 exec/s: 39 rss: 68Mb L: 17/39 MS: 1 InsertByte- 00:08:19.130 [2024-11-19 17:52:11.794480] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3dff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.794506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.130 #40 NEW cov: 11811 ft: 14577 corp: 27/568b lim: 40 exec/s: 40 rss: 68Mb L: 15/39 MS: 1 PersAutoDict- DE: "\377\213l]\374,/f"- 00:08:19.130 [2024-11-19 17:52:11.834743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00003d00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.834768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.834841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.834855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.130 #41 NEW cov: 11811 ft: 14583 corp: 28/584b lim: 40 exec/s: 41 rss: 68Mb L: 16/39 MS: 1 ChangeBit- 00:08:19.130 [2024-11-19 17:52:11.874848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.874873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.874942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2a3d2411 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.874955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.130 #42 NEW cov: 11811 ft: 14590 corp: 29/601b lim: 40 exec/s: 42 rss: 68Mb L: 17/39 MS: 1 ChangeBinInt- 00:08:19.130 [2024-11-19 17:52:11.914858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b352400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.914883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.130 #43 NEW cov: 11811 ft: 14609 corp: 30/610b lim: 40 exec/s: 43 rss: 69Mb L: 9/39 MS: 1 EraseBytes- 00:08:19.130 [2024-11-19 17:52:11.955502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.955527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.955588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.955605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.955682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.955696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.955757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4d2b4ff cdw11:ffffffb4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.955771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.130 [2024-11-19 17:52:11.955830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:b4240000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.130 [2024-11-19 17:52:11.955844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.130 #44 NEW cov: 11811 ft: 14657 corp: 31/650b lim: 40 exec/s: 44 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:19.391 [2024-11-19 17:52:11.995236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b000000 cdw11:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:11.995261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:11.995325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:24000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:11.995339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.391 #45 NEW cov: 11811 ft: 14660 corp: 32/666b lim: 40 exec/s: 45 rss: 69Mb L: 16/40 MS: 1 CopyPart- 00:08:19.391 [2024-11-19 17:52:12.025431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b000000 cdw11:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.025455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.025515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2400ff8b cdw11:6c5dfc2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.025532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.025592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2f660000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.025610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.391 #46 NEW cov: 11811 ft: 14671 corp: 33/690b lim: 40 exec/s: 46 rss: 69Mb L: 24/40 MS: 1 PersAutoDict- DE: "\377\213l]\374,/f"- 00:08:19.391 [2024-11-19 17:52:12.065310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.065335] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 #47 NEW cov: 11811 ft: 14682 corp: 34/701b lim: 40 exec/s: 47 rss: 69Mb L: 11/40 MS: 1 EraseBytes- 00:08:19.391 [2024-11-19 17:52:12.105705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.105731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.105806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2a5b3d24 cdw11:11000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.105820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.105879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:35000000 cdw11:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.105892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.391 #48 NEW cov: 11811 ft: 14724 corp: 35/730b lim: 40 exec/s: 48 rss: 69Mb L: 29/40 MS: 1 CrossOver- 00:08:19.391 [2024-11-19 17:52:12.145812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0acb5bff cdw11:8b6c5dfc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.145836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.145911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2c2f6635 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.145925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.145984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.145998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.391 #49 NEW cov: 11811 ft: 14747 corp: 36/756b lim: 40 exec/s: 49 rss: 69Mb L: 26/40 MS: 1 InsertByte- 00:08:19.391 [2024-11-19 17:52:12.185978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.186002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.186079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3db4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.186094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.186155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 
nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.186169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.391 [2024-11-19 17:52:12.186224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:b4d2b4b4 cdw11:b42c0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.186236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.391 #50 NEW cov: 11811 ft: 14778 corp: 37/792b lim: 40 exec/s: 50 rss: 69Mb L: 36/40 MS: 1 ChangeBit- 00:08:19.391 [2024-11-19 17:52:12.225716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.391 [2024-11-19 17:52:12.225740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.391 #51 NEW cov: 11811 ft: 14805 corp: 38/802b lim: 40 exec/s: 51 rss: 69Mb L: 10/40 MS: 1 EraseBytes- 00:08:19.652 [2024-11-19 17:52:12.266107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fe5b3500 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.266132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 [2024-11-19 17:52:12.266209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:003d2400 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.266224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.652 [2024-11-19 17:52:12.266285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:04000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.266298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.652 #52 NEW cov: 11811 ft: 14839 corp: 39/827b lim: 40 exec/s: 52 rss: 69Mb L: 25/40 MS: 1 CMP- DE: "\001\004\000\000\000\000\000\000"- 00:08:19.652 [2024-11-19 17:52:12.306070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.306095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 [2024-11-19 17:52:12.306171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d240000 cdw11:0003002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.306184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.652 #53 NEW cov: 11811 ft: 14854 corp: 40/843b lim: 40 exec/s: 53 rss: 69Mb L: 16/40 MS: 1 ChangeByte- 00:08:19.652 [2024-11-19 17:52:12.346206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.346230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 [2024-11-19 17:52:12.346309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.346324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.652 #54 NEW cov: 11811 ft: 14878 corp: 41/859b lim: 40 exec/s: 54 rss: 69Mb L: 16/40 MS: 1 ShuffleBytes- 00:08:19.652 [2024-11-19 17:52:12.386211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00010400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.386235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 #55 NEW cov: 11811 ft: 14924 corp: 42/873b lim: 40 exec/s: 55 rss: 69Mb L: 14/40 MS: 1 PersAutoDict- DE: "\001\004\000\000\000\000\000\000"- 00:08:19.652 [2024-11-19 17:52:12.426482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.426507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 [2024-11-19 17:52:12.426568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d000000 cdw11:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.426581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.652 #56 NEW cov: 11811 ft: 14956 corp: 43/894b lim: 40 exec/s: 56 rss: 69Mb L: 21/40 MS: 1 CopyPart- 00:08:19.652 [2024-11-19 17:52:12.466424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.466448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 #57 NEW cov: 11811 ft: 14980 corp: 44/905b lim: 40 exec/s: 57 rss: 70Mb L: 11/40 MS: 1 ChangeByte- 00:08:19.652 [2024-11-19 17:52:12.506718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:5b350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.506742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.652 [2024-11-19 17:52:12.506815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d000000 cdw11:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.652 [2024-11-19 17:52:12.506829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.912 #58 NEW cov: 11811 ft: 15063 corp: 45/921b lim: 40 exec/s: 29 rss: 70Mb L: 16/40 MS: 1 ChangeBinInt- 00:08:19.912 #58 DONE cov: 11811 ft: 15063 corp: 45/921b lim: 40 exec/s: 29 rss: 70Mb 00:08:19.912 ###### Recommended dictionary. 
###### 00:08:19.912 "\377\213l]\374,/f" # Uses: 2 00:08:19.912 "\001\004\000\000\000\000\000\000" # Uses: 1 00:08:19.912 ###### End of recommended dictionary. ###### 00:08:19.912 Done 58 runs in 2 second(s) 00:08:19.912 17:52:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:19.912 17:52:12 -- ../common.sh@72 -- # (( i++ )) 00:08:19.912 17:52:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.912 17:52:12 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:19.912 17:52:12 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:19.912 17:52:12 -- nvmf/run.sh@24 -- # local timen=1 00:08:19.912 17:52:12 -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.912 17:52:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.912 17:52:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:19.912 17:52:12 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:19.912 17:52:12 -- nvmf/run.sh@29 -- # port=4411 00:08:19.912 17:52:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.912 17:52:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:19.912 17:52:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.912 17:52:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:19.912 [2024-11-19 17:52:12.680143] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:19.913 [2024-11-19 17:52:12.680212] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638830 ] 00:08:19.913 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.173 [2024-11-19 17:52:12.930648] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.173 [2024-11-19 17:52:12.956529] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.173 [2024-11-19 17:52:12.956674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.173 [2024-11-19 17:52:13.007977] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.173 [2024-11-19 17:52:13.024292] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:20.433 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.433 INFO: Seed: 3680474724 00:08:20.433 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:20.433 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:20.433 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:20.433 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.433 #2 INITED exec/s: 0 rss: 59Mb 00:08:20.433 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:20.433 This may also happen if the target rejected all inputs we tried so far 00:08:20.433 [2024-11-19 17:52:13.091591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.433 [2024-11-19 17:52:13.091633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.433 [2024-11-19 17:52:13.091715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.433 [2024-11-19 17:52:13.091730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.433 [2024-11-19 17:52:13.091808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.433 [2024-11-19 17:52:13.091823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.693 NEW_FUNC[1/671]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:20.693 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:20.693 #12 NEW cov: 11596 ft: 11592 corp: 2/25b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 5 ChangeByte-CrossOver-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:20.693 [2024-11-19 17:52:13.411453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.411491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.411627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.411655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.411783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.411807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.693 #13 NEW cov: 11709 ft: 12362 corp: 3/54b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CrossOver- 00:08:20.693 [2024-11-19 17:52:13.462231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.462259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.462377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dddddd0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 
17:52:13.462394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.462511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.462525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.462669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.462685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.462821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.462837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.693 #14 NEW cov: 11715 ft: 12972 corp: 4/94b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:20.693 [2024-11-19 17:52:13.511772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.511801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.511927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000021 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.511946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.512071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.512088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.693 #20 NEW cov: 11800 ft: 13171 corp: 5/124b lim: 40 exec/s: 0 rss: 68Mb L: 30/40 MS: 1 InsertByte- 00:08:20.693 [2024-11-19 17:52:13.552081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.552108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.552235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 [2024-11-19 17:52:13.552252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.693 [2024-11-19 17:52:13.552385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.693 
[2024-11-19 17:52:13.552404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.953 #21 NEW cov: 11800 ft: 13289 corp: 6/148b lim: 40 exec/s: 0 rss: 68Mb L: 24/40 MS: 1 ChangeBit- 00:08:20.953 [2024-11-19 17:52:13.592143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.953 [2024-11-19 17:52:13.592168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.953 [2024-11-19 17:52:13.592298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:21000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.953 [2024-11-19 17:52:13.592314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.953 [2024-11-19 17:52:13.592438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.953 [2024-11-19 17:52:13.592453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.953 #22 NEW cov: 11800 ft: 13357 corp: 7/179b lim: 40 exec/s: 0 rss: 68Mb L: 31/40 MS: 1 InsertByte- 00:08:20.953 [2024-11-19 17:52:13.632241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000006 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.953 [2024-11-19 17:52:13.632268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.953 [2024-11-19 17:52:13.632405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.953 [2024-11-19 17:52:13.632421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.632553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.632571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.954 #23 NEW cov: 11800 ft: 13535 corp: 8/203b lim: 40 exec/s: 0 rss: 68Mb L: 24/40 MS: 1 ChangeBinInt- 00:08:20.954 [2024-11-19 17:52:13.672603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.672628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.672766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000601 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.672781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.672900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.672916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.673041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.673057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.954 #24 NEW cov: 11800 ft: 13649 corp: 9/238b lim: 40 exec/s: 0 rss: 68Mb L: 35/40 MS: 1 CopyPart- 00:08:20.954 [2024-11-19 17:52:13.712484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.712514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.712642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00003f00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.712660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.712794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.712810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.954 #25 NEW cov: 11800 ft: 13713 corp: 10/268b lim: 40 exec/s: 0 rss: 68Mb L: 30/40 MS: 1 InsertByte- 00:08:20.954 [2024-11-19 17:52:13.752804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.752829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.752953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.752969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.753102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.753121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.753252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.753269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.954 #26 NEW cov: 11800 ft: 13727 corp: 11/300b lim: 40 exec/s: 0 rss: 68Mb L: 32/40 MS: 1 
CopyPart- 00:08:20.954 [2024-11-19 17:52:13.792721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.792747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.792873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:21000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.792890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.954 [2024-11-19 17:52:13.793024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.954 [2024-11-19 17:52:13.793040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.214 #27 NEW cov: 11800 ft: 13739 corp: 12/331b lim: 40 exec/s: 0 rss: 68Mb L: 31/40 MS: 1 ShuffleBytes- 00:08:21.214 [2024-11-19 17:52:13.833049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a002323 cdw11:23230000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.214 [2024-11-19 17:52:13.833074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.214 [2024-11-19 17:52:13.833200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.214 [2024-11-19 17:52:13.833220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.214 [2024-11-19 17:52:13.833354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.214 [2024-11-19 17:52:13.833370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.214 [2024-11-19 17:52:13.833495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.214 [2024-11-19 17:52:13.833510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.214 #28 NEW cov: 11800 ft: 13776 corp: 13/365b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:08:21.214 [2024-11-19 17:52:13.872987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000007 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.214 [2024-11-19 17:52:13.873013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.214 [2024-11-19 17:52:13.873152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.214 [2024-11-19 17:52:13.873168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:13.873301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.873317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.215 #29 NEW cov: 11800 ft: 13821 corp: 14/389b lim: 40 exec/s: 0 rss: 68Mb L: 24/40 MS: 1 ChangeBit- 00:08:21.215 [2024-11-19 17:52:13.913186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.913214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:13.913348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.913365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:13.913497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.913513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.215 #30 NEW cov: 11800 ft: 13828 corp: 15/418b lim: 40 exec/s: 0 rss: 68Mb L: 29/40 MS: 1 ShuffleBytes- 00:08:21.215 [2024-11-19 17:52:13.953000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.953026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:13.953152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000021 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.953178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.215 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.215 #31 NEW cov: 11823 ft: 14085 corp: 16/440b lim: 40 exec/s: 0 rss: 68Mb L: 22/40 MS: 1 EraseBytes- 00:08:21.215 [2024-11-19 17:52:13.993005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.993032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:13.993155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:13.993172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.215 #32 NEW cov: 11823 ft: 14100 corp: 17/461b lim: 40 exec/s: 0 rss: 68Mb L: 21/40 
MS: 1 EraseBytes- 00:08:21.215 [2024-11-19 17:52:14.033704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a002323 cdw11:23230000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.033730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:14.033867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.033884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:14.034006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.034023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:14.034146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.034163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.215 #33 NEW cov: 11823 ft: 14177 corp: 18/495b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:21.215 [2024-11-19 17:52:14.073882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.073910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:14.074054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.074070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:14.074214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.074230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.215 [2024-11-19 17:52:14.074367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.215 [2024-11-19 17:52:14.074385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.475 #34 NEW cov: 11823 ft: 14231 corp: 19/527b lim: 40 exec/s: 34 rss: 68Mb L: 32/40 MS: 1 InsertRepeatedBytes- 00:08:21.475 [2024-11-19 17:52:14.113682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.113710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.475 [2024-11-19 17:52:14.113841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.113856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.475 [2024-11-19 17:52:14.113979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.113996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.475 #35 NEW cov: 11823 ft: 14237 corp: 20/557b lim: 40 exec/s: 35 rss: 69Mb L: 30/40 MS: 1 InsertByte- 00:08:21.475 [2024-11-19 17:52:14.164372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.164401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.475 [2024-11-19 17:52:14.164532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:21000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.164550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.475 [2024-11-19 17:52:14.164683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.164701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.475 [2024-11-19 17:52:14.164825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000c5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.475 [2024-11-19 17:52:14.164842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.475 [2024-11-19 17:52:14.164970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:c5c5c5c5 cdw11:c5c5c50a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.164989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.476 #36 NEW cov: 11823 ft: 14252 corp: 21/597b lim: 40 exec/s: 36 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:21.476 [2024-11-19 17:52:14.203941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a00007e cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.203969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.476 [2024-11-19 17:52:14.204110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.204130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.476 [2024-11-19 17:52:14.204272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.204288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.476 #37 NEW cov: 11823 ft: 14306 corp: 22/621b lim: 40 exec/s: 37 rss: 69Mb L: 24/40 MS: 1 ChangeByte- 00:08:21.476 [2024-11-19 17:52:14.244366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a003100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.244395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.476 [2024-11-19 17:52:14.244530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.244546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.476 [2024-11-19 17:52:14.244684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.244701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.476 [2024-11-19 17:52:14.244847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.244863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.476 #38 NEW cov: 11823 ft: 14347 corp: 23/654b lim: 40 exec/s: 38 rss: 69Mb L: 33/40 MS: 1 InsertByte- 00:08:21.476 [2024-11-19 17:52:14.293856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.293885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.476 [2024-11-19 17:52:14.294008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.476 [2024-11-19 17:52:14.294025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.476 #39 NEW cov: 11823 ft: 14362 corp: 24/675b lim: 40 exec/s: 39 rss: 69Mb L: 21/40 MS: 1 CopyPart- 00:08:21.736 [2024-11-19 17:52:14.344319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000006 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.344347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.736 [2024-11-19 17:52:14.344470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.344487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.736 [2024-11-19 17:52:14.344609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.344626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.736 #40 NEW cov: 11823 ft: 14391 corp: 25/699b lim: 40 exec/s: 40 rss: 69Mb L: 24/40 MS: 1 ShuffleBytes- 00:08:21.736 [2024-11-19 17:52:14.384155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.384183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.736 [2024-11-19 17:52:14.384310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.384327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.736 #41 NEW cov: 11823 ft: 14401 corp: 26/719b lim: 40 exec/s: 41 rss: 69Mb L: 20/40 MS: 1 EraseBytes- 00:08:21.736 [2024-11-19 17:52:14.434492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.434521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.736 [2024-11-19 17:52:14.434655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.434673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.736 [2024-11-19 17:52:14.434816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.434833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.736 [2024-11-19 17:52:14.434966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.434984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.736 #42 NEW cov: 11823 ft: 14439 corp: 27/751b lim: 40 exec/s: 42 rss: 69Mb L: 32/40 MS: 1 ShuffleBytes- 00:08:21.736 [2024-11-19 17:52:14.475077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.736 [2024-11-19 17:52:14.475104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.475236] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00002100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.475252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.475379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.475397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.475539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.475555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.737 #43 NEW cov: 11823 ft: 14471 corp: 28/785b lim: 40 exec/s: 43 rss: 69Mb L: 34/40 MS: 1 CrossOver- 00:08:21.737 [2024-11-19 17:52:14.525204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.525231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.525360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00002100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.525376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.525496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:005b0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.525512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.525646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:21000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.525667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.737 #44 NEW cov: 11823 ft: 14492 corp: 29/820b lim: 40 exec/s: 44 rss: 69Mb L: 35/40 MS: 1 InsertByte- 00:08:21.737 [2024-11-19 17:52:14.575331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a003100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.575357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.575476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.575493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.575625] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00970000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.575641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.737 [2024-11-19 17:52:14.575774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.737 [2024-11-19 17:52:14.575791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.997 #45 NEW cov: 11823 ft: 14505 corp: 30/854b lim: 40 exec/s: 45 rss: 69Mb L: 34/40 MS: 1 InsertByte- 00:08:21.997 [2024-11-19 17:52:14.625759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.625785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.625904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dd2ddd0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.625922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.626054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.626073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.626202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.626219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.626356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.626373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.997 #46 NEW cov: 11823 ft: 14532 corp: 31/894b lim: 40 exec/s: 46 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:21.997 [2024-11-19 17:52:14.675313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.675338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.675480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.675498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 
17:52:14.675627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.675643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.997 #47 NEW cov: 11823 ft: 14537 corp: 32/924b lim: 40 exec/s: 47 rss: 69Mb L: 30/40 MS: 1 ShuffleBytes- 00:08:21.997 [2024-11-19 17:52:14.715425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.715453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.715592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.715611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.715755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.715771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.997 #48 NEW cov: 11823 ft: 14558 corp: 33/952b lim: 40 exec/s: 48 rss: 69Mb L: 28/40 MS: 1 EraseBytes- 00:08:21.997 [2024-11-19 17:52:14.755439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.755466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.755604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000fdff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.755622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.755752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.755767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.997 #49 NEW cov: 11823 ft: 14584 corp: 34/982b lim: 40 exec/s: 49 rss: 69Mb L: 30/40 MS: 1 ChangeBinInt- 00:08:21.997 [2024-11-19 17:52:14.795636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.795661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.997 [2024-11-19 17:52:14.795793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:21000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.997 [2024-11-19 17:52:14.795810] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.998 [2024-11-19 17:52:14.795932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.998 [2024-11-19 17:52:14.795949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.998 #50 NEW cov: 11823 ft: 14603 corp: 35/1007b lim: 40 exec/s: 50 rss: 70Mb L: 25/40 MS: 1 EraseBytes- 00:08:21.998 [2024-11-19 17:52:14.836002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a080000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.998 [2024-11-19 17:52:14.836028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.998 [2024-11-19 17:52:14.836150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00002100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.998 [2024-11-19 17:52:14.836165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.998 [2024-11-19 17:52:14.836289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:005b0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.998 [2024-11-19 17:52:14.836305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.998 [2024-11-19 17:52:14.836427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:21000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.998 [2024-11-19 17:52:14.836443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.258 #51 NEW cov: 11823 ft: 14617 corp: 36/1042b lim: 40 exec/s: 51 rss: 70Mb L: 35/40 MS: 1 ChangeBit- 00:08:22.258 [2024-11-19 17:52:14.875845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000002 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.875870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.875997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.876014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.876142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.876158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.258 #52 NEW cov: 11823 ft: 14655 corp: 37/1066b lim: 40 exec/s: 52 rss: 70Mb L: 24/40 MS: 1 ChangeBit- 00:08:22.258 [2024-11-19 17:52:14.915989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:08000006 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.916014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.916150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.916166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.916305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.916321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.258 #53 NEW cov: 11823 ft: 14665 corp: 38/1090b lim: 40 exec/s: 53 rss: 70Mb L: 24/40 MS: 1 ChangeBit- 00:08:22.258 [2024-11-19 17:52:14.955785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.955811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.955932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:21000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.955949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.258 #54 NEW cov: 11823 ft: 14675 corp: 39/1113b lim: 40 exec/s: 54 rss: 70Mb L: 23/40 MS: 1 InsertByte- 00:08:22.258 [2024-11-19 17:52:14.996508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.996534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.996670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00003f00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.996688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.996816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00007676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.996843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:14.996968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:76760000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:14.996984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.258 #55 NEW cov: 11823 ft: 14676 corp: 40/1151b lim: 40 exec/s: 55 rss: 70Mb L: 38/40 MS: 1 InsertRepeatedBytes- 
00:08:22.258 [2024-11-19 17:52:15.036090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:15.036117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:15.036254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:15.036274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.258 #56 NEW cov: 11823 ft: 14685 corp: 41/1172b lim: 40 exec/s: 56 rss: 70Mb L: 21/40 MS: 1 CrossOver- 00:08:22.258 [2024-11-19 17:52:15.076522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a002323 cdw11:23230000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:15.076550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:15.076682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.258 [2024-11-19 17:52:15.076700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.258 [2024-11-19 17:52:15.076838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.259 [2024-11-19 17:52:15.076855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.259 #57 NEW cov: 11823 ft: 14688 corp: 42/1203b lim: 40 exec/s: 28 rss: 70Mb L: 31/40 MS: 1 CrossOver- 00:08:22.259 #57 DONE cov: 11823 ft: 14688 corp: 42/1203b lim: 40 exec/s: 28 rss: 70Mb 00:08:22.259 Done 57 runs in 2 second(s) 00:08:22.519 17:52:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:22.519 17:52:15 -- ../common.sh@72 -- # (( i++ )) 00:08:22.519 17:52:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.519 17:52:15 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:22.519 17:52:15 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:22.519 17:52:15 -- nvmf/run.sh@24 -- # local timen=1 00:08:22.519 17:52:15 -- nvmf/run.sh@25 -- # local core=0x1 00:08:22.519 17:52:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:22.519 17:52:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:22.519 17:52:15 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:22.519 17:52:15 -- nvmf/run.sh@29 -- # port=4412 00:08:22.519 17:52:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:22.519 17:52:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:22.519 17:52:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:22.519 17:52:15 -- nvmf/run.sh@36 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:22.519 [2024-11-19 17:52:15.252724] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:22.519 [2024-11-19 17:52:15.252796] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639378 ] 00:08:22.519 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.780 [2024-11-19 17:52:15.503588] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.780 [2024-11-19 17:52:15.531409] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.780 [2024-11-19 17:52:15.531527] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.780 [2024-11-19 17:52:15.583023] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:22.780 [2024-11-19 17:52:15.599348] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:22.780 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.780 INFO: Seed: 1958511316 00:08:22.780 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:22.780 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:22.780 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:22.780 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.780 #2 INITED exec/s: 0 rss: 59Mb 00:08:22.780 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:22.780 This may also happen if the target rejected all inputs we tried so far 00:08:23.040 [2024-11-19 17:52:15.648564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.040 [2024-11-19 17:52:15.648592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.040 [2024-11-19 17:52:15.648674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.040 [2024-11-19 17:52:15.648689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.040 [2024-11-19 17:52:15.648749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.040 [2024-11-19 17:52:15.648763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.040 [2024-11-19 17:52:15.648810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.040 [2024-11-19 17:52:15.648827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.301 NEW_FUNC[1/671]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:23.301 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.301 #4 NEW cov: 11594 ft: 11595 corp: 2/35b lim: 40 exec/s: 0 rss: 67Mb L: 34/34 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:23.301 [2024-11-19 17:52:15.949226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.301 [2024-11-19 17:52:15.949256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.301 [2024-11-19 17:52:15.949307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.301 [2024-11-19 17:52:15.949321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.301 [2024-11-19 17:52:15.949373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.301 [2024-11-19 17:52:15.949386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.301 [2024-11-19 17:52:15.949441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2f0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.301 [2024-11-19 17:52:15.949454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.301 #10 NEW cov: 11707 
ft: 12092 corp: 3/70b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertByte- 00:08:23.302 [2024-11-19 17:52:15.999291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:15.999317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:15.999386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:15.999400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:15.999454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:15.999467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:15.999520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c110c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:15.999534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.302 #16 NEW cov: 11713 ft: 12327 corp: 4/104b lim: 40 exec/s: 0 rss: 68Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:23.302 [2024-11-19 17:52:16.039394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.039420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.039476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.039492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.039543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.039556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.039610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c11070c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.039623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.302 #17 NEW cov: 11798 ft: 12573 corp: 5/138b lim: 40 exec/s: 0 rss: 68Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:23.302 [2024-11-19 17:52:16.079193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2d4a8585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.079219] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.079273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:85858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.079286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.302 #22 NEW cov: 11798 ft: 13080 corp: 6/154b lim: 40 exec/s: 0 rss: 68Mb L: 16/35 MS: 5 ShuffleBytes-InsertByte-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:23.302 [2024-11-19 17:52:16.119589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.119618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.119690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.119705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.119759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.119772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.119827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c11070c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.119840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.302 #23 NEW cov: 11798 ft: 13198 corp: 7/188b lim: 40 exec/s: 0 rss: 68Mb L: 34/35 MS: 1 ChangeByte- 00:08:23.302 [2024-11-19 17:52:16.159584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.159618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.159672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.159685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.302 [2024-11-19 17:52:16.159736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.302 [2024-11-19 17:52:16.159753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.563 #24 NEW cov: 11798 ft: 13489 corp: 8/219b lim: 40 exec/s: 0 rss: 68Mb L: 31/35 MS: 1 EraseBytes- 00:08:23.563 [2024-11-19 17:52:16.199882] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.199907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.199962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.199975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.200027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.200057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.200109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.200122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.563 #25 NEW cov: 11798 ft: 13519 corp: 9/258b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CrossOver- 00:08:23.563 [2024-11-19 17:52:16.239806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.239832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.239901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.239915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.239969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c1107 cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.239983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.563 #26 NEW cov: 11798 ft: 13587 corp: 10/283b lim: 40 exec/s: 0 rss: 68Mb L: 25/39 MS: 1 EraseBytes- 00:08:23.563 [2024-11-19 17:52:16.280230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.280255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.280308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.280322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.280374] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.280388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.280440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c000000 cdw11:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.280457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.280507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:070c0c0c cdw11:0c0c0c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.280519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.563 #27 NEW cov: 11798 ft: 13659 corp: 11/323b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:23.563 [2024-11-19 17:52:16.320237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.320262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.320333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.320346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.320399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.320412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.320465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.320477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.563 #28 NEW cov: 11798 ft: 13675 corp: 12/357b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 ChangeByte- 00:08:23.563 [2024-11-19 17:52:16.360311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.360336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.360405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.360419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.563 
[2024-11-19 17:52:16.360473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:7e0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.360487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.563 [2024-11-19 17:52:16.360545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c1107 cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.563 [2024-11-19 17:52:16.360559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.563 #29 NEW cov: 11798 ft: 13707 corp: 13/392b lim: 40 exec/s: 0 rss: 68Mb L: 35/40 MS: 1 InsertByte- 00:08:23.563 [2024-11-19 17:52:16.400004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.564 [2024-11-19 17:52:16.400028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 #32 NEW cov: 11798 ft: 14456 corp: 14/405b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 3 CrossOver-EraseBytes-CrossOver- 00:08:23.825 [2024-11-19 17:52:16.450284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.450309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.450364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.450377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.825 #33 NEW cov: 11798 ft: 14472 corp: 15/422b lim: 40 exec/s: 0 rss: 68Mb L: 17/40 MS: 1 EraseBytes- 00:08:23.825 [2024-11-19 17:52:16.490391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2d4a8588 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.490416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.490468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:85858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.490481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.825 #34 NEW cov: 11798 ft: 14477 corp: 16/438b lim: 40 exec/s: 0 rss: 68Mb L: 16/40 MS: 1 ChangeBinInt- 00:08:23.825 [2024-11-19 17:52:16.530831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.530855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.530909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.530922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.530974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.530987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.531038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c11070c cdw11:0c0c0cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.531051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.825 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.825 #35 NEW cov: 11821 ft: 14534 corp: 17/477b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:23.825 [2024-11-19 17:52:16.570611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.570636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.570693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.570706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.825 #36 NEW cov: 11821 ft: 14560 corp: 18/500b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 EraseBytes- 00:08:23.825 [2024-11-19 17:52:16.610915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.610939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.610996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.611009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.611065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.611077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.825 #37 NEW cov: 11821 ft: 14578 corp: 19/526b lim: 40 exec/s: 37 rss: 69Mb L: 26/40 MS: 1 CrossOver- 00:08:23.825 [2024-11-19 17:52:16.651189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 
[2024-11-19 17:52:16.651214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.651269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.651282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.651334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.651347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.825 [2024-11-19 17:52:16.651399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.825 [2024-11-19 17:52:16.651412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.825 #38 NEW cov: 11821 ft: 14590 corp: 20/562b lim: 40 exec/s: 38 rss: 69Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:24.087 [2024-11-19 17:52:16.691330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.691355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.691426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.691440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.691494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.691507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.691560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.691574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.087 #44 NEW cov: 11821 ft: 14612 corp: 21/596b lim: 40 exec/s: 44 rss: 69Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:24.087 [2024-11-19 17:52:16.731379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.731403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.731472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c220c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.731485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.731536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.731550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.731606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c11070c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.731619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.087 #45 NEW cov: 11821 ft: 14703 corp: 22/630b lim: 40 exec/s: 45 rss: 69Mb L: 34/40 MS: 1 ChangeBinInt- 00:08:24.087 [2024-11-19 17:52:16.771475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.771500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.771553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:040c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.771566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.771626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.771639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.771691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c110c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.771704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.087 #46 NEW cov: 11821 ft: 14723 corp: 23/664b lim: 40 exec/s: 46 rss: 69Mb L: 34/40 MS: 1 ChangeBit- 00:08:24.087 [2024-11-19 17:52:16.811318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2d0e0000 cdw11:00858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.811343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.811397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:85858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.811410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.087 #47 NEW cov: 11821 ft: 14762 corp: 24/680b lim: 40 exec/s: 47 rss: 69Mb L: 16/40 MS: 1 CMP- DE: "\016\000\000\000"- 00:08:24.087 [2024-11-19 17:52:16.851589] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.851618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.851678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.851691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.851745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.851758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.087 #48 NEW cov: 11821 ft: 14820 corp: 25/711b lim: 40 exec/s: 48 rss: 69Mb L: 31/40 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:08:24.087 [2024-11-19 17:52:16.891993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c2f0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.892018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.892070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.892084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.892134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.892148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.892201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c000000 cdw11:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.892214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.087 [2024-11-19 17:52:16.892267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:070c0c0c cdw11:0c0c0c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.892280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.087 #49 NEW cov: 11821 ft: 14828 corp: 26/751b lim: 40 exec/s: 49 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:08:24.087 [2024-11-19 17:52:16.932053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.932079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:24.087 [2024-11-19 17:52:16.932130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c08 cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.087 [2024-11-19 17:52:16.932143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.088 [2024-11-19 17:52:16.932196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.088 [2024-11-19 17:52:16.932209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.088 [2024-11-19 17:52:16.932260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.088 [2024-11-19 17:52:16.932273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.348 #50 NEW cov: 11821 ft: 14835 corp: 27/785b lim: 40 exec/s: 50 rss: 69Mb L: 34/40 MS: 1 ChangeBit- 00:08:24.348 [2024-11-19 17:52:16.961891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:16.961916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:16.961969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:080c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:16.961982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:16.962036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:16.962049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.348 #51 NEW cov: 11821 ft: 14880 corp: 28/814b lim: 40 exec/s: 51 rss: 69Mb L: 29/40 MS: 1 CrossOver- 00:08:24.348 [2024-11-19 17:52:17.001882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.001907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.001962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.001976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.348 #52 NEW cov: 11821 ft: 14895 corp: 29/837b lim: 40 exec/s: 52 rss: 69Mb L: 23/40 MS: 1 ChangeBinInt- 00:08:24.348 [2024-11-19 17:52:17.042171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 
17:52:17.042196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.042251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:080c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.042264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.042319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.042332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.348 #53 NEW cov: 11821 ft: 14985 corp: 30/866b lim: 40 exec/s: 53 rss: 69Mb L: 29/40 MS: 1 ShuffleBytes- 00:08:24.348 [2024-11-19 17:52:17.082409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.082434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.082488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.082501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.082554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.082567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.082627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.082639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.348 #54 NEW cov: 11821 ft: 14988 corp: 31/903b lim: 40 exec/s: 54 rss: 69Mb L: 37/40 MS: 1 InsertByte- 00:08:24.348 [2024-11-19 17:52:17.122482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.122507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.122562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.122575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.122647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:24.348 [2024-11-19 17:52:17.122661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.122716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.122729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.348 #55 NEW cov: 11821 ft: 15003 corp: 32/941b lim: 40 exec/s: 55 rss: 69Mb L: 38/40 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:08:24.348 [2024-11-19 17:52:17.152275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2d858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.152299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.152352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:85858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.152365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.348 #56 NEW cov: 11821 ft: 15017 corp: 33/957b lim: 40 exec/s: 56 rss: 69Mb L: 16/40 MS: 1 CopyPart- 00:08:24.348 [2024-11-19 17:52:17.192382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.192407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.348 [2024-11-19 17:52:17.192462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.348 [2024-11-19 17:52:17.192476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.609 #57 NEW cov: 11821 ft: 15069 corp: 34/974b lim: 40 exec/s: 57 rss: 69Mb L: 17/40 MS: 1 CopyPart- 00:08:24.609 [2024-11-19 17:52:17.232772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c08 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.232797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.232851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.232868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.232919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.232931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.232985] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c11070c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.232997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.609 #58 NEW cov: 11821 ft: 15079 corp: 35/1008b lim: 40 exec/s: 58 rss: 69Mb L: 34/40 MS: 1 ChangeBit- 00:08:24.609 [2024-11-19 17:52:17.272892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.272917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.272973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:3affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.272987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.273041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.273070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.273124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.273137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.609 #59 NEW cov: 11821 ft: 15096 corp: 36/1045b lim: 40 exec/s: 59 rss: 69Mb L: 37/40 MS: 1 InsertByte- 00:08:24.609 [2024-11-19 17:52:17.313002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.313027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.313084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:3affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.313097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.313150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.313163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.313215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff0e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.313227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.609 #60 NEW cov: 11821 ft: 
15098 corp: 37/1082b lim: 40 exec/s: 60 rss: 69Mb L: 37/40 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:08:24.609 [2024-11-19 17:52:17.352984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.353012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.353068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.353081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.609 [2024-11-19 17:52:17.353136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:11070c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.609 [2024-11-19 17:52:17.353149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.609 #61 NEW cov: 11821 ft: 15106 corp: 38/1107b lim: 40 exec/s: 61 rss: 69Mb L: 25/40 MS: 1 CopyPart- 00:08:24.610 [2024-11-19 17:52:17.393243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.393269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.610 [2024-11-19 17:52:17.393326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.393339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.610 [2024-11-19 17:52:17.393398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.393411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.610 [2024-11-19 17:52:17.393468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2f0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.393481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.610 #62 NEW cov: 11821 ft: 15109 corp: 39/1143b lim: 40 exec/s: 62 rss: 69Mb L: 36/40 MS: 1 CopyPart- 00:08:24.610 [2024-11-19 17:52:17.433429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.433455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.610 [2024-11-19 17:52:17.433526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.433540] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.610 [2024-11-19 17:52:17.433606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.433619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.610 [2024-11-19 17:52:17.433675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c11070c cdw11:0c0c0cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.610 [2024-11-19 17:52:17.433689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.610 #63 NEW cov: 11821 ft: 15111 corp: 40/1182b lim: 40 exec/s: 63 rss: 70Mb L: 39/40 MS: 1 ChangeByte- 00:08:24.870 [2024-11-19 17:52:17.473384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:060c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.473414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.473469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.473482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.473538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.473552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.871 #64 NEW cov: 11821 ft: 15130 corp: 41/1208b lim: 40 exec/s: 64 rss: 70Mb L: 26/40 MS: 1 ChangeBinInt- 00:08:24.871 [2024-11-19 17:52:17.513318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0e002d85 cdw11:00850085 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.513342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.513410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:85858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.513424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.871 #65 NEW cov: 11821 ft: 15134 corp: 42/1224b lim: 40 exec/s: 65 rss: 70Mb L: 16/40 MS: 1 ShuffleBytes- 00:08:24.871 [2024-11-19 17:52:17.553869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c2f0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.553894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.553942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0cf00c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.553955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.554008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.554036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.554091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c000000 cdw11:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.554104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.554157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:070c0c0c cdw11:0c0c0c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.554170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.871 #71 NEW cov: 11821 ft: 15140 corp: 43/1264b lim: 40 exec/s: 71 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:24.871 [2024-11-19 17:52:17.593987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.594011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.594064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.594078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.594131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.594144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.594197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c000c0a cdw11:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.594210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.594261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:070c0c0c cdw11:0c0c0c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.594274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.871 #72 NEW cov: 11821 ft: 15150 corp: 44/1304b lim: 40 exec/s: 72 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:08:24.871 [2024-11-19 17:52:17.634138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.634162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.634217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.634230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.634284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.634297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.634351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0c000000 cdw11:0000000c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.634364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.871 [2024-11-19 17:52:17.634417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.871 [2024-11-19 17:52:17.634430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.871 #73 NEW cov: 11821 ft: 15159 corp: 45/1344b lim: 40 exec/s: 36 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:08:24.871 #73 DONE cov: 11821 ft: 15159 corp: 45/1344b lim: 40 exec/s: 36 rss: 70Mb 00:08:24.871 ###### Recommended dictionary. ###### 00:08:24.871 "\016\000\000\000" # Uses: 3 00:08:24.871 ###### End of recommended dictionary. 
###### 00:08:24.871 Done 73 runs in 2 second(s) 00:08:25.132 17:52:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:25.132 17:52:17 -- ../common.sh@72 -- # (( i++ )) 00:08:25.132 17:52:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.132 17:52:17 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:25.132 17:52:17 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:25.132 17:52:17 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.132 17:52:17 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.132 17:52:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:25.132 17:52:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:25.132 17:52:17 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:25.132 17:52:17 -- nvmf/run.sh@29 -- # port=4413 00:08:25.132 17:52:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:25.132 17:52:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:25.132 17:52:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.132 17:52:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:25.132 [2024-11-19 17:52:17.806333] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:25.132 [2024-11-19 17:52:17.806412] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639882 ] 00:08:25.132 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.392 [2024-11-19 17:52:18.059476] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.392 [2024-11-19 17:52:18.088291] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.392 [2024-11-19 17:52:18.088427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.392 [2024-11-19 17:52:18.139763] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.392 [2024-11-19 17:52:18.156095] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:25.392 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.392 INFO: Seed: 220561251 00:08:25.392 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:25.392 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:25.392 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:25.392 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.392 #2 INITED exec/s: 0 rss: 59Mb 00:08:25.392 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:25.392 This may also happen if the target rejected all inputs we tried so far 00:08:25.392 [2024-11-19 17:52:18.205008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.392 [2024-11-19 17:52:18.205035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.652 NEW_FUNC[1/670]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:25.652 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:25.652 #11 NEW cov: 11578 ft: 11583 corp: 2/16b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 4 InsertRepeatedBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:25.652 [2024-11-19 17:52:18.505783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8ebbbb15 cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.652 [2024-11-19 17:52:18.505815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 #13 NEW cov: 11695 ft: 12023 corp: 3/25b lim: 40 exec/s: 0 rss: 68Mb L: 9/15 MS: 2 InsertRepeatedBytes-CrossOver- 00:08:25.912 [2024-11-19 17:52:18.545874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.545899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 #14 NEW cov: 11701 ft: 12285 corp: 4/40b lim: 40 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 ChangeBit- 00:08:25.912 [2024-11-19 17:52:18.585973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.585999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 #20 NEW cov: 11786 ft: 12660 corp: 5/55b lim: 40 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 CrossOver- 00:08:25.912 [2024-11-19 17:52:18.626059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:288ebbbb cdw11:15150000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.626084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 #21 NEW cov: 11786 ft: 12760 corp: 6/65b lim: 40 exec/s: 0 rss: 68Mb L: 10/15 MS: 1 InsertByte- 00:08:25.912 [2024-11-19 17:52:18.666256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb1515 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.666281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 [2024-11-19 17:52:18.666339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0015150a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.666353] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.912 #22 NEW cov: 11786 ft: 13239 corp: 7/81b lim: 40 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertByte- 00:08:25.912 [2024-11-19 17:52:18.706280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.706304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 #23 NEW cov: 11786 ft: 13332 corp: 8/96b lim: 40 exec/s: 0 rss: 68Mb L: 15/16 MS: 1 ChangeBinInt- 00:08:25.912 [2024-11-19 17:52:18.746775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.746801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.912 [2024-11-19 17:52:18.746858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:15f4f4f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.912 [2024-11-19 17:52:18.746872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.913 [2024-11-19 17:52:18.746926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f4f4f4f4 cdw11:f4f4f4f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.913 [2024-11-19 17:52:18.746939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.913 [2024-11-19 17:52:18.746992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f4f4f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.913 [2024-11-19 17:52:18.747005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.913 #24 NEW cov: 11786 ft: 13889 corp: 9/135b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:26.173 [2024-11-19 17:52:18.786494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:feffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.786520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 #25 NEW cov: 11786 ft: 13970 corp: 10/150b lim: 40 exec/s: 0 rss: 68Mb L: 15/39 MS: 1 ChangeBinInt- 00:08:26.173 [2024-11-19 17:52:18.826759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:feffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.826783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 [2024-11-19 17:52:18.826840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.826853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.173 #26 NEW cov: 11786 ft: 14081 corp: 11/170b lim: 40 exec/s: 0 rss: 68Mb L: 20/39 MS: 1 InsertRepeatedBytes- 00:08:26.173 [2024-11-19 17:52:18.866910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:018c6c62 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.866936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 [2024-11-19 17:52:18.866994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f427d66 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.867008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.173 #27 NEW cov: 11786 ft: 14100 corp: 12/193b lim: 40 exec/s: 0 rss: 68Mb L: 23/39 MS: 1 CMP- DE: "\001\214lb/B}f"- 00:08:26.173 [2024-11-19 17:52:18.907001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:018c6c62 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.907026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 [2024-11-19 17:52:18.907083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2d427d66 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.907098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.173 #28 NEW cov: 11786 ft: 14142 corp: 13/216b lim: 40 exec/s: 0 rss: 68Mb L: 23/39 MS: 1 ChangeBit- 00:08:26.173 [2024-11-19 17:52:18.947010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbbe3ea cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.947036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 #29 NEW cov: 11786 ft: 14185 corp: 14/231b lim: 40 exec/s: 0 rss: 68Mb L: 15/39 MS: 1 ChangeBinInt- 00:08:26.173 [2024-11-19 17:52:18.987078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb1515 cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:18.987102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 #30 NEW cov: 11786 ft: 14227 corp: 15/246b lim: 40 exec/s: 0 rss: 68Mb L: 15/39 MS: 1 CopyPart- 00:08:26.173 [2024-11-19 17:52:19.027359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbbe3ea cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:19.027383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.173 [2024-11-19 17:52:19.027460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:158e25be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.173 [2024-11-19 17:52:19.027474] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.448 #31 NEW cov: 11786 ft: 14316 corp: 16/269b lim: 40 exec/s: 0 rss: 69Mb L: 23/39 MS: 1 CMP- DE: "%\276\323Ebl\214\000"- 00:08:26.448 [2024-11-19 17:52:19.067338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.067362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.448 #32 NEW cov: 11786 ft: 14331 corp: 17/282b lim: 40 exec/s: 0 rss: 69Mb L: 13/39 MS: 1 EraseBytes- 00:08:26.448 [2024-11-19 17:52:19.107459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb1515 cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.107485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.448 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.448 #33 NEW cov: 11809 ft: 14402 corp: 18/297b lim: 40 exec/s: 0 rss: 69Mb L: 15/39 MS: 1 ChangeBit- 00:08:26.448 [2024-11-19 17:52:19.147563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:288ebbbb cdw11:15150000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.147588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.448 #34 NEW cov: 11809 ft: 14437 corp: 19/307b lim: 40 exec/s: 0 rss: 69Mb L: 10/39 MS: 1 ShuffleBytes- 00:08:26.448 [2024-11-19 17:52:19.187821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.187846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.448 [2024-11-19 17:52:19.187922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:bbbb1515 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.187937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.448 #35 NEW cov: 11809 ft: 14451 corp: 20/327b lim: 40 exec/s: 35 rss: 69Mb L: 20/39 MS: 1 CrossOver- 00:08:26.448 [2024-11-19 17:52:19.227843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:28018c6c cdw11:622f427d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.227867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.448 #36 NEW cov: 11809 ft: 14461 corp: 21/337b lim: 40 exec/s: 36 rss: 69Mb L: 10/39 MS: 1 PersAutoDict- DE: "\001\214lb/B}f"- 00:08:26.448 [2024-11-19 17:52:19.267945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:0025bed3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.267970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:26.448 #37 NEW cov: 11809 ft: 14483 corp: 22/352b lim: 40 exec/s: 37 rss: 69Mb L: 15/39 MS: 1 PersAutoDict- DE: "%\276\323Ebl\214\000"- 00:08:26.448 [2024-11-19 17:52:19.308166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bb15bb15 cdw11:008b0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.308193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.448 [2024-11-19 17:52:19.308254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0015150a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.448 [2024-11-19 17:52:19.308269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.710 #38 NEW cov: 11809 ft: 14550 corp: 23/368b lim: 40 exec/s: 38 rss: 69Mb L: 16/39 MS: 1 ShuffleBytes- 00:08:26.710 [2024-11-19 17:52:19.348315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb1515 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.348341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.348415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80000000 cdw11:0015150a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.348429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.710 #39 NEW cov: 11809 ft: 14574 corp: 24/384b lim: 40 exec/s: 39 rss: 69Mb L: 16/39 MS: 1 ChangeBit- 00:08:26.710 [2024-11-19 17:52:19.388665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb1515 cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.388691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.388762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:15150aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.388777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.388833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.388847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.388903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.388916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.710 #40 NEW cov: 11809 ft: 14604 corp: 25/421b lim: 40 exec/s: 40 rss: 69Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:08:26.710 [2024-11-19 17:52:19.428678] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:018c6c62 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.428702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.428785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2d427d66 cdw11:2d420000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.428799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.428855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00001515 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.428868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.710 #41 NEW cov: 11809 ft: 14782 corp: 26/446b lim: 40 exec/s: 41 rss: 69Mb L: 25/39 MS: 1 CopyPart- 00:08:26.710 [2024-11-19 17:52:19.468812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:018c6c62 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.468837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.468891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f427d66 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.468909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.710 [2024-11-19 17:52:19.468962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ec00 cdw11:0015150a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.710 [2024-11-19 17:52:19.468975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.711 #42 NEW cov: 11809 ft: 14796 corp: 27/470b lim: 40 exec/s: 42 rss: 69Mb L: 24/39 MS: 1 InsertByte- 00:08:26.711 [2024-11-19 17:52:19.508661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0fbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.711 [2024-11-19 17:52:19.508685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.711 #43 NEW cov: 11809 ft: 14889 corp: 28/485b lim: 40 exec/s: 43 rss: 69Mb L: 15/39 MS: 1 ChangeBinInt- 00:08:26.711 [2024-11-19 17:52:19.538857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:158c6c62 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.711 [2024-11-19 17:52:19.538882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.711 [2024-11-19 17:52:19.538953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2d427d66 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.711 [2024-11-19 
17:52:19.538967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.711 #44 NEW cov: 11809 ft: 14900 corp: 29/508b lim: 40 exec/s: 44 rss: 69Mb L: 23/39 MS: 1 CrossOver- 00:08:26.972 [2024-11-19 17:52:19.578851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8e15bbbb cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.578876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 #45 NEW cov: 11809 ft: 14924 corp: 30/517b lim: 40 exec/s: 45 rss: 69Mb L: 9/39 MS: 1 ShuffleBytes- 00:08:26.972 [2024-11-19 17:52:19.619193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000f1515 cdw11:018c6c62 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.619218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 [2024-11-19 17:52:19.619293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f427d66 cdw11:000000ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.619306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.972 [2024-11-19 17:52:19.619364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0015150a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.619378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.972 #46 NEW cov: 11809 ft: 14933 corp: 31/541b lim: 40 exec/s: 46 rss: 69Mb L: 24/39 MS: 1 ShuffleBytes- 00:08:26.972 [2024-11-19 17:52:19.659102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:018c6c62 cdw11:2f427d66 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.659127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 #47 NEW cov: 11809 ft: 14941 corp: 32/551b lim: 40 exec/s: 47 rss: 69Mb L: 10/39 MS: 1 PersAutoDict- DE: "\001\214lb/B}f"- 00:08:26.972 [2024-11-19 17:52:19.699661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bf000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.699692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 [2024-11-19 17:52:19.699773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.699788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.972 [2024-11-19 17:52:19.699843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000bb15 cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.699856] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.972 [2024-11-19 17:52:19.699908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00158ebb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.699922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.972 #48 NEW cov: 11809 ft: 14945 corp: 33/583b lim: 40 exec/s: 48 rss: 69Mb L: 32/39 MS: 1 InsertRepeatedBytes- 00:08:26.972 [2024-11-19 17:52:19.739319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.739343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 #49 NEW cov: 11809 ft: 14954 corp: 34/598b lim: 40 exec/s: 49 rss: 69Mb L: 15/39 MS: 1 ChangeByte- 00:08:26.972 [2024-11-19 17:52:19.769393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.769417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 #50 NEW cov: 11809 ft: 15012 corp: 35/613b lim: 40 exec/s: 50 rss: 69Mb L: 15/39 MS: 1 ChangeBinInt- 00:08:26.972 [2024-11-19 17:52:19.799499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bbbb8ebb cdw11:bb151500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.972 [2024-11-19 17:52:19.799523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.972 #51 NEW cov: 11809 ft: 15015 corp: 36/628b lim: 40 exec/s: 51 rss: 69Mb L: 15/39 MS: 1 CrossOver- 00:08:27.231 [2024-11-19 17:52:19.839620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbbe3ea cdw11:27ff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:19.839647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.232 #52 NEW cov: 11809 ft: 15029 corp: 37/643b lim: 40 exec/s: 52 rss: 69Mb L: 15/39 MS: 1 ChangeByte- 00:08:27.232 [2024-11-19 17:52:19.879916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbb1515 cdw11:feffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:19.879941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.232 [2024-11-19 17:52:19.879995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:12d556e1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:19.880009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.232 #53 NEW cov: 11809 ft: 15053 corp: 38/662b lim: 40 exec/s: 53 rss: 69Mb L: 19/39 MS: 1 CMP- DE: "\022\325V\341"- 00:08:27.232 [2024-11-19 17:52:19.919844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:0fbb1515 cdw11:00018c6c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:19.919868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.232 #54 NEW cov: 11809 ft: 15095 corp: 39/677b lim: 40 exec/s: 54 rss: 70Mb L: 15/39 MS: 1 PersAutoDict- DE: "\001\214lb/B}f"- 00:08:27.232 [2024-11-19 17:52:19.960062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bf002515 cdw11:bb15bed3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:19.960086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.232 #55 NEW cov: 11809 ft: 15109 corp: 40/692b lim: 40 exec/s: 55 rss: 70Mb L: 15/39 MS: 1 ShuffleBytes- 00:08:27.232 [2024-11-19 17:52:20.000615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bf000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:20.000641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.232 [2024-11-19 17:52:20.000699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000bb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:20.000713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.232 [2024-11-19 17:52:20.000769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000015 cdw11:15000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:20.000783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.232 [2024-11-19 17:52:20.000840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00158ebb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:20.000854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.232 #56 NEW cov: 11809 ft: 15125 corp: 41/724b lim: 40 exec/s: 56 rss: 70Mb L: 32/39 MS: 1 ShuffleBytes- 00:08:27.232 [2024-11-19 17:52:20.050357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:28018c6c cdw11:622f427d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:20.050382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.232 #57 NEW cov: 11809 ft: 15149 corp: 42/734b lim: 40 exec/s: 57 rss: 70Mb L: 10/39 MS: 1 ChangeBit- 00:08:27.232 [2024-11-19 17:52:20.090406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bfbbe3ea cdw11:27ff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.232 [2024-11-19 17:52:20.090431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.491 #58 NEW cov: 11809 ft: 15155 corp: 43/749b lim: 40 exec/s: 58 rss: 70Mb L: 15/39 MS: 1 ChangeBinInt- 00:08:27.491 [2024-11-19 17:52:20.130874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:28018c6c cdw11:622f427d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.491 [2024-11-19 17:52:20.130899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.491 [2024-11-19 17:52:20.130960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6700000f cdw11:1515158c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.491 [2024-11-19 17:52:20.130974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.491 [2024-11-19 17:52:20.131034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:6c622d42 cdw11:7d660000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.491 [2024-11-19 17:52:20.131051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.491 #59 NEW cov: 11809 ft: 15167 corp: 44/777b lim: 40 exec/s: 59 rss: 70Mb L: 28/39 MS: 1 CrossOver- 00:08:27.491 [2024-11-19 17:52:20.170696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:28018c6c cdw11:622f427d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.491 [2024-11-19 17:52:20.170721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.491 #60 NEW cov: 11809 ft: 15174 corp: 45/790b lim: 40 exec/s: 30 rss: 70Mb L: 13/39 MS: 1 CopyPart- 00:08:27.491 #60 DONE cov: 11809 ft: 15174 corp: 45/790b lim: 40 exec/s: 30 rss: 70Mb 00:08:27.491 ###### Recommended dictionary. ###### 00:08:27.491 "\001\214lb/B}f" # Uses: 3 00:08:27.491 "%\276\323Ebl\214\000" # Uses: 1 00:08:27.491 "\022\325V\341" # Uses: 0 00:08:27.492 ###### End of recommended dictionary. 
###### 00:08:27.492 Done 60 runs in 2 second(s) 00:08:27.492 17:52:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:27.492 17:52:20 -- ../common.sh@72 -- # (( i++ )) 00:08:27.492 17:52:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.492 17:52:20 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:27.492 17:52:20 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:27.492 17:52:20 -- nvmf/run.sh@24 -- # local timen=1 00:08:27.492 17:52:20 -- nvmf/run.sh@25 -- # local core=0x1 00:08:27.492 17:52:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:27.492 17:52:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:27.492 17:52:20 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:27.492 17:52:20 -- nvmf/run.sh@29 -- # port=4414 00:08:27.492 17:52:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:27.492 17:52:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:27.492 17:52:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:27.492 17:52:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:27.492 [2024-11-19 17:52:20.349616] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:27.492 [2024-11-19 17:52:20.349708] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640210 ] 00:08:27.752 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.752 [2024-11-19 17:52:20.613728] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.012 [2024-11-19 17:52:20.641695] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.012 [2024-11-19 17:52:20.641838] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.012 [2024-11-19 17:52:20.693498] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.012 [2024-11-19 17:52:20.709840] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:28.012 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.012 INFO: Seed: 2774561914 00:08:28.012 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:28.012 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:28.012 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:28.012 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.012 #2 INITED exec/s: 0 rss: 60Mb 00:08:28.012 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:28.012 This may also happen if the target rejected all inputs we tried so far 00:08:28.012 [2024-11-19 17:52:20.780535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.012 [2024-11-19 17:52:20.780575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.272 NEW_FUNC[1/671]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:28.272 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.272 #4 NEW cov: 11576 ft: 11577 corp: 2/10b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:28.272 [2024-11-19 17:52:21.101348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.272 [2024-11-19 17:52:21.101397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.272 [2024-11-19 17:52:21.101533] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.272 [2024-11-19 17:52:21.101552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.272 #5 NEW cov: 11689 ft: 12942 corp: 3/24b lim: 35 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 CrossOver- 00:08:28.531 [2024-11-19 17:52:21.162012] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.531 [2024-11-19 17:52:21.162048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.531 [2024-11-19 17:52:21.162194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.531 [2024-11-19 17:52:21.162212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.531 [2024-11-19 17:52:21.162346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.162365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.162507] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.162525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.532 #6 NEW cov: 11702 ft: 13429 corp: 4/52b lim: 35 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:28.532 [2024-11-19 17:52:21.211141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.211176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:28.532 #7 NEW cov: 11787 ft: 13777 corp: 5/61b lim: 35 exec/s: 0 rss: 67Mb L: 9/28 MS: 1 ChangeBit- 00:08:28.532 [2024-11-19 17:52:21.262157] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.262189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.262327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.262345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.262477] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.262494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.262638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.262657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.532 #8 NEW cov: 11787 ft: 13863 corp: 6/92b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:08:28.532 [2024-11-19 17:52:21.321880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.321911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.322041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.322062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.532 #9 NEW cov: 11787 ft: 13980 corp: 7/106b lim: 35 exec/s: 0 rss: 67Mb L: 14/31 MS: 1 ShuffleBytes- 00:08:28.532 [2024-11-19 17:52:21.372424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.372458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.372605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.372626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.532 [2024-11-19 17:52:21.372757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.532 [2024-11-19 17:52:21.372779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.791 #10 NEW cov: 11787 
ft: 14200 corp: 8/133b lim: 35 exec/s: 0 rss: 68Mb L: 27/31 MS: 1 CopyPart- 00:08:28.791 [2024-11-19 17:52:21.432500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.432533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.432683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.432704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.432833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.432859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.791 #11 NEW cov: 11787 ft: 14277 corp: 9/160b lim: 35 exec/s: 0 rss: 68Mb L: 27/31 MS: 1 ChangeBit- 00:08:28.791 [2024-11-19 17:52:21.492284] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.492316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.492447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.492470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.791 #12 NEW cov: 11787 ft: 14316 corp: 10/178b lim: 35 exec/s: 0 rss: 68Mb L: 18/31 MS: 1 EraseBytes- 00:08:28.791 [2024-11-19 17:52:21.543439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.543471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.543603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.543623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.543774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.543790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.543940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.543960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.544058] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.544076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:28.791 #18 NEW cov: 11787 ft: 14404 corp: 11/213b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:28.791 [2024-11-19 17:52:21.602530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.602564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.791 #19 NEW cov: 11787 ft: 14447 corp: 12/222b lim: 35 exec/s: 0 rss: 68Mb L: 9/35 MS: 1 ChangeByte- 00:08:28.791 [2024-11-19 17:52:21.652987] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.653017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.791 [2024-11-19 17:52:21.653102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.791 [2024-11-19 17:52:21.653128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.051 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.051 #20 NEW cov: 11810 ft: 14476 corp: 13/241b lim: 35 exec/s: 0 rss: 68Mb L: 19/35 MS: 1 InsertByte- 00:08:29.051 [2024-11-19 17:52:21.713428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.051 [2024-11-19 17:52:21.713461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.051 [2024-11-19 17:52:21.713602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.713629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.052 [2024-11-19 17:52:21.713778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:6 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.713803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.052 NEW_FUNC[1/1]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:29.052 #21 NEW cov: 11820 ft: 14516 corp: 14/268b lim: 35 exec/s: 0 rss: 68Mb L: 27/35 MS: 1 ChangeByte- 00:08:29.052 [2024-11-19 17:52:21.763027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.763062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.052 #22 NEW cov: 11820 ft: 14562 corp: 15/277b lim: 35 
exec/s: 22 rss: 68Mb L: 9/35 MS: 1 ChangeByte- 00:08:29.052 [2024-11-19 17:52:21.813225] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.813257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.052 #23 NEW cov: 11820 ft: 14575 corp: 16/286b lim: 35 exec/s: 23 rss: 68Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:29.052 [2024-11-19 17:52:21.863653] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.863687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.052 [2024-11-19 17:52:21.863838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.863861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.052 #24 NEW cov: 11820 ft: 14612 corp: 17/304b lim: 35 exec/s: 24 rss: 68Mb L: 18/35 MS: 1 CopyPart- 00:08:29.052 [2024-11-19 17:52:21.913593] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.052 [2024-11-19 17:52:21.913629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.311 #25 NEW cov: 11820 ft: 14676 corp: 18/314b lim: 35 exec/s: 25 rss: 68Mb L: 10/35 MS: 1 InsertByte- 00:08:29.311 [2024-11-19 17:52:21.963950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:21.963989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.311 [2024-11-19 17:52:21.964124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:21.964144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.311 #26 NEW cov: 11820 ft: 14698 corp: 19/328b lim: 35 exec/s: 26 rss: 68Mb L: 14/35 MS: 1 CopyPart- 00:08:29.311 [2024-11-19 17:52:22.014195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:22.014226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.311 [2024-11-19 17:52:22.014353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:22.014379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.311 #27 NEW cov: 11820 ft: 14712 corp: 20/346b lim: 35 exec/s: 27 rss: 68Mb L: 18/35 MS: 1 ChangeByte- 00:08:29.311 [2024-11-19 17:52:22.074386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:22.074423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.311 [2024-11-19 17:52:22.074562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:22.074587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.311 #28 NEW cov: 11820 ft: 14721 corp: 21/365b lim: 35 exec/s: 28 rss: 68Mb L: 19/35 MS: 1 InsertByte- 00:08:29.311 [2024-11-19 17:52:22.124548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:22.124582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.311 [2024-11-19 17:52:22.124717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.311 [2024-11-19 17:52:22.124745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.311 #29 NEW cov: 11820 ft: 14735 corp: 22/383b lim: 35 exec/s: 29 rss: 68Mb L: 18/35 MS: 1 ChangeByte- 00:08:29.572 [2024-11-19 17:52:22.174978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.175014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.175154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.175177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.175321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.175346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.572 #30 NEW cov: 11820 ft: 14749 corp: 23/405b lim: 35 exec/s: 30 rss: 69Mb L: 22/35 MS: 1 InsertRepeatedBytes- 00:08:29.572 [2024-11-19 17:52:22.235854] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.235889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.236024] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.236043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.236179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 
cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.236195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.236335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.236353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.236511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:8000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.236538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.572 #31 NEW cov: 11820 ft: 14814 corp: 24/440b lim: 35 exec/s: 31 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:29.572 [2024-11-19 17:52:22.294785] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.294817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.572 #32 NEW cov: 11820 ft: 14875 corp: 25/449b lim: 35 exec/s: 32 rss: 69Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:29.572 [2024-11-19 17:52:22.344968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.345000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.572 #33 NEW cov: 11820 ft: 14904 corp: 26/456b lim: 35 exec/s: 33 rss: 69Mb L: 7/35 MS: 1 EraseBytes- 00:08:29.572 [2024-11-19 17:52:22.396343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.396378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.396512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.396535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.396675] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.396695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.396833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.572 [2024-11-19 17:52:22.396850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.572 [2024-11-19 17:52:22.396984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:29.572 [2024-11-19 17:52:22.397008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.572 #34 NEW cov: 11820 ft: 14955 corp: 27/491b lim: 35 exec/s: 34 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:29.832 [2024-11-19 17:52:22.455195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.832 [2024-11-19 17:52:22.455231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.832 #35 NEW cov: 11820 ft: 14984 corp: 28/498b lim: 35 exec/s: 35 rss: 69Mb L: 7/35 MS: 1 ShuffleBytes- 00:08:29.832 [2024-11-19 17:52:22.505956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.832 [2024-11-19 17:52:22.505989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.832 [2024-11-19 17:52:22.506138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.832 [2024-11-19 17:52:22.506158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.832 [2024-11-19 17:52:22.506303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.832 [2024-11-19 17:52:22.506322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.832 #36 NEW cov: 11820 ft: 15012 corp: 29/521b lim: 35 exec/s: 36 rss: 69Mb L: 23/35 MS: 1 EraseBytes- 00:08:29.832 [2024-11-19 17:52:22.555620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.832 [2024-11-19 17:52:22.555654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.832 #37 NEW cov: 11820 ft: 15034 corp: 30/534b lim: 35 exec/s: 37 rss: 69Mb L: 13/35 MS: 1 CopyPart- 00:08:29.833 [2024-11-19 17:52:22.605773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.833 [2024-11-19 17:52:22.605806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.833 #38 NEW cov: 11820 ft: 15048 corp: 31/547b lim: 35 exec/s: 38 rss: 69Mb L: 13/35 MS: 1 CrossOver- 00:08:29.833 [2024-11-19 17:52:22.656846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.833 [2024-11-19 17:52:22.656878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.833 [2024-11-19 17:52:22.657022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.833 [2024-11-19 17:52:22.657042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:29.833 [2024-11-19 17:52:22.657179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000064 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.833 [2024-11-19 17:52:22.657197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.833 [2024-11-19 17:52:22.657343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.833 [2024-11-19 17:52:22.657361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.833 #39 NEW cov: 11820 ft: 15061 corp: 32/578b lim: 35 exec/s: 39 rss: 69Mb L: 31/35 MS: 1 CMP- DE: "\377\213ldc&&("- 00:08:30.093 [2024-11-19 17:52:22.717234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.717268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.093 [2024-11-19 17:52:22.717425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.717442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.093 [2024-11-19 17:52:22.717588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.717609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.093 [2024-11-19 17:52:22.717747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.717764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.093 [2024-11-19 17:52:22.717897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:8000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.717921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:30.093 #40 NEW cov: 11820 ft: 15075 corp: 33/613b lim: 35 exec/s: 40 rss: 69Mb L: 35/35 MS: 1 ChangeByte- 00:08:30.093 [2024-11-19 17:52:22.776530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.776562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.093 [2024-11-19 17:52:22.776685] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000064 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.093 [2024-11-19 17:52:22.776703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.093 #41 NEW cov: 11820 ft: 15088 corp: 34/628b lim: 35 exec/s: 20 rss: 69Mb L: 15/35 MS: 1 PersAutoDict- DE: "\377\213ldc&&("- 00:08:30.093 #41 DONE cov: 11820 ft: 15088 
corp: 34/628b lim: 35 exec/s: 20 rss: 69Mb 00:08:30.093 ###### Recommended dictionary. ###### 00:08:30.093 "\377\213ldc&&(" # Uses: 1 00:08:30.093 ###### End of recommended dictionary. ###### 00:08:30.093 Done 41 runs in 2 second(s) 00:08:30.093 17:52:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:30.093 17:52:22 -- ../common.sh@72 -- # (( i++ )) 00:08:30.093 17:52:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.093 17:52:22 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:30.093 17:52:22 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:30.093 17:52:22 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.093 17:52:22 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.093 17:52:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:30.093 17:52:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:30.093 17:52:22 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:30.093 17:52:22 -- nvmf/run.sh@29 -- # port=4415 00:08:30.093 17:52:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:30.093 17:52:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:30.093 17:52:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.093 17:52:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:30.354 [2024-11-19 17:52:22.957109] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:30.354 [2024-11-19 17:52:22.957199] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640746 ] 00:08:30.354 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.354 [2024-11-19 17:52:23.207612] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.613 [2024-11-19 17:52:23.235091] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.613 [2024-11-19 17:52:23.235213] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.613 [2024-11-19 17:52:23.286520] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.613 [2024-11-19 17:52:23.302855] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:30.614 INFO: Running with entropic power schedule (0xFF, 100). 
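The xtrace lines above show how the harness rotates from fuzzer 14 to fuzzer 15: the previous /tmp config is removed, the loop counter in ../common.sh advances, and start_llvm_fuzz derives the per-fuzzer TCP port ("44" plus the zero-padded fuzzer id, so 15 becomes 4415), the corpus directory, and the JSON config before launching llvm_nvme_fuzz. A minimal bash sketch of that per-iteration flow, reconstructed from the trace; the $rootdir shorthand and the redirect of sed's output into $nvmf_cfg are assumptions (bash xtrace does not print redirections), and the authoritative logic lives in test/fuzz/llvm/nvmf/run.sh:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # shorthand, illustrative
    fuzzer_type=15; timen=1; core=0x1        # the "start_llvm_fuzz 15 1 0x1" arguments
    corpus_dir=$rootdir/../corpus/llvm_nvmf_$(printf %02d $fuzzer_type)
    nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    port=44$(printf %02d $fuzzer_type)       # printf %02d 15 -> port=4415
    mkdir -p $corpus_dir
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # retarget the template config at the per-fuzzer port (output redirect assumed)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
      -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
      -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk$fuzzer_type.sock

The -Z value selects which fuzz target runs and -r names the SPDK RPC socket, both visible verbatim in the trace above and again in the run-16 hand-off further down.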
00:08:30.614 INFO: Seed: 1072601375 00:08:30.614 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:30.614 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:30.614 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:30.614 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.614 #2 INITED exec/s: 0 rss: 59Mb 00:08:30.614 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:30.614 This may also happen if the target rejected all inputs we tried so far 00:08:30.614 [2024-11-19 17:52:23.352474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.614 [2024-11-19 17:52:23.352501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.614 [2024-11-19 17:52:23.352564] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.614 [2024-11-19 17:52:23.352578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.614 [2024-11-19 17:52:23.352655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.614 [2024-11-19 17:52:23.352669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.874 NEW_FUNC[1/671]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:30.874 NEW_FUNC[2/671]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:30.874 #9 NEW cov: 11578 ft: 11579 corp: 2/35b lim: 35 exec/s: 0 rss: 66Mb L: 34/34 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:30.874 [2024-11-19 17:52:23.653197] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.874 [2024-11-19 17:52:23.653228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.874 [2024-11-19 17:52:23.653288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.874 [2024-11-19 17:52:23.653302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.874 [2024-11-19 17:52:23.653363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000563 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.874 [2024-11-19 17:52:23.653376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.874 #20 NEW cov: 11691 ft: 12171 corp: 3/69b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:30.874 [2024-11-19 17:52:23.703275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.874 [2024-11-19 17:52:23.703301] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.874 [2024-11-19 17:52:23.703359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.874 [2024-11-19 17:52:23.703373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.874 [2024-11-19 17:52:23.703434] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.874 [2024-11-19 17:52:23.703449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.874 #21 NEW cov: 11697 ft: 12342 corp: 4/103b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:31.133 [2024-11-19 17:52:23.742940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.133 [2024-11-19 17:52:23.742966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.133 #23 NEW cov: 11782 ft: 13275 corp: 5/112b lim: 35 exec/s: 0 rss: 68Mb L: 9/34 MS: 2 ChangeBit-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.133 [2024-11-19 17:52:23.783491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.133 [2024-11-19 17:52:23.783523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.133 [2024-11-19 17:52:23.783582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.133 [2024-11-19 17:52:23.783596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.133 [2024-11-19 17:52:23.783679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.133 [2024-11-19 17:52:23.783692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.133 #24 NEW cov: 11782 ft: 13427 corp: 6/146b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.133 [2024-11-19 17:52:23.823614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.823639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.134 [2024-11-19 17:52:23.823714] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.823728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.134 [2024-11-19 17:52:23.823798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000563 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.823811] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.134 #35 NEW cov: 11782 ft: 13488 corp: 7/180b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeByte- 00:08:31.134 [2024-11-19 17:52:23.863303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.863327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.134 #36 NEW cov: 11782 ft: 13596 corp: 8/193b lim: 35 exec/s: 0 rss: 68Mb L: 13/34 MS: 1 InsertRepeatedBytes- 00:08:31.134 [2024-11-19 17:52:23.903918] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.903943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.134 [2024-11-19 17:52:23.904005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.904018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.134 NEW_FUNC[1/1]: 0x481108 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:08:31.134 #37 NEW cov: 11805 ft: 13836 corp: 9/227b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeByte- 00:08:31.134 #38 NEW cov: 11805 ft: 13932 corp: 10/236b lim: 35 exec/s: 0 rss: 68Mb L: 9/34 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.134 [2024-11-19 17:52:23.984017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.984043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.134 [2024-11-19 17:52:23.984124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.984138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.134 [2024-11-19 17:52:23.984179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.134 [2024-11-19 17:52:23.984193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.394 NEW_FUNC[1/1]: 0x48a868 in feat_keep_alive_timer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:364 00:08:31.394 #39 NEW cov: 11824 ft: 14035 corp: 11/266b lim: 35 exec/s: 0 rss: 68Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:08:31.394 [2024-11-19 17:52:24.024258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.024282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.024318] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.024331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.024389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.024403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.394 #40 NEW cov: 11824 ft: 14067 corp: 12/300b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:08:31.394 [2024-11-19 17:52:24.064330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.064355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.064416] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.064428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.064490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000563 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.064503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.394 #41 NEW cov: 11824 ft: 14084 corp: 13/334b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeBit- 00:08:31.394 [2024-11-19 17:52:24.104552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.104576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.104639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.104653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.104709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.104721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.104778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.104791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.394 #42 NEW cov: 11824 ft: 14272 corp: 14/369b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:08:31.394 NEW_FUNC[1/2]: 0x485c78 in feat_interrupt_vector_configuration 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:332 00:08:31.394 NEW_FUNC[2/2]: 0x113bd48 in nvmf_ctrlr_get_features_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1592 00:08:31.394 #43 NEW cov: 11873 ft: 14411 corp: 15/378b lim: 35 exec/s: 0 rss: 69Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:31.394 [2024-11-19 17:52:24.184420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.184447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.394 #44 NEW cov: 11873 ft: 14500 corp: 16/393b lim: 35 exec/s: 0 rss: 69Mb L: 15/35 MS: 1 EraseBytes- 00:08:31.394 [2024-11-19 17:52:24.224958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.224982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.225052] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.225066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.225126] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.225140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.394 [2024-11-19 17:52:24.225200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.394 [2024-11-19 17:52:24.225213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.394 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.394 #45 NEW cov: 11896 ft: 14552 corp: 17/428b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:08:31.655 [2024-11-19 17:52:24.264967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.264993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.265050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.265065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.265124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000563 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.265137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.655 #46 NEW cov: 11896 ft: 14560 
corp: 18/462b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:08:31.655 [2024-11-19 17:52:24.305172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.305196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.305256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.305269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.305332] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.305346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.305406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.305419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.655 #47 NEW cov: 11896 ft: 14610 corp: 19/497b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:08:31.655 [2024-11-19 17:52:24.345333] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.345358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.345405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.345418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.345479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.345492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.345550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.345563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.655 #48 NEW cov: 11896 ft: 14625 corp: 20/532b lim: 35 exec/s: 48 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:08:31.655 [2024-11-19 17:52:24.385249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.385273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.385348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.385362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.385422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.385435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.655 #49 NEW cov: 11896 ft: 14646 corp: 21/562b lim: 35 exec/s: 49 rss: 69Mb L: 30/35 MS: 1 ShuffleBytes- 00:08:31.655 [2024-11-19 17:52:24.425069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.425095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.655 #50 NEW cov: 11896 ft: 14663 corp: 22/578b lim: 35 exec/s: 50 rss: 69Mb L: 16/35 MS: 1 InsertByte- 00:08:31.655 #51 NEW cov: 11896 ft: 14674 corp: 23/587b lim: 35 exec/s: 51 rss: 69Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:31.655 [2024-11-19 17:52:24.505658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.505683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.505747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.505761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.655 [2024-11-19 17:52:24.505835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-19 17:52:24.505849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.915 #52 NEW cov: 11896 ft: 14710 corp: 24/621b lim: 35 exec/s: 52 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:08:31.915 [2024-11-19 17:52:24.545310] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.545334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.915 #53 NEW cov: 11896 ft: 14722 corp: 25/630b lim: 35 exec/s: 53 rss: 69Mb L: 9/35 MS: 1 ChangeByte- 00:08:31.915 #54 NEW cov: 11896 ft: 14733 corp: 26/639b lim: 35 exec/s: 54 rss: 69Mb L: 9/35 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.915 [2024-11-19 17:52:24.625982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.626006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.626081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:31.915 [2024-11-19 17:52:24.626095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.626156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.626170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.915 #55 NEW cov: 11896 ft: 14736 corp: 27/673b lim: 35 exec/s: 55 rss: 69Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:31.915 [2024-11-19 17:52:24.656180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.656204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.656263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.656277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.656334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.656347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.656405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.656418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.915 #56 NEW cov: 11896 ft: 14746 corp: 28/708b lim: 35 exec/s: 56 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:31.915 [2024-11-19 17:52:24.696315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.696339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.696406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.696420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.696480] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.696493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.696553] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.696566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.915 #57 NEW cov: 11896 ft: 
14762 corp: 29/743b lim: 35 exec/s: 57 rss: 69Mb L: 35/35 MS: 1 ChangeByte- 00:08:31.915 [2024-11-19 17:52:24.736317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.736341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.915 [2024-11-19 17:52:24.736400] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-19 17:52:24.736413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.916 #58 NEW cov: 11896 ft: 14798 corp: 30/777b lim: 35 exec/s: 58 rss: 69Mb L: 34/35 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:32.175 #59 NEW cov: 11896 ft: 14814 corp: 31/786b lim: 35 exec/s: 59 rss: 69Mb L: 9/35 MS: 1 ChangeByte- 00:08:32.175 [2024-11-19 17:52:24.816088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.816113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.175 #60 NEW cov: 11896 ft: 14874 corp: 32/799b lim: 35 exec/s: 60 rss: 69Mb L: 13/35 MS: 1 ChangeByte- 00:08:32.175 [2024-11-19 17:52:24.856669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.856694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.856756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.856770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.856829] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.856842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.175 #61 NEW cov: 11896 ft: 14939 corp: 33/833b lim: 35 exec/s: 61 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:08:32.175 [2024-11-19 17:52:24.896956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.896980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.897040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.897053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.897115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 
17:52:24.897128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.897188] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.897202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:32.175 #62 NEW cov: 11896 ft: 14943 corp: 34/868b lim: 35 exec/s: 62 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:32.175 [2024-11-19 17:52:24.936992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.937017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.937074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.937087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.937131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.937144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.175 [2024-11-19 17:52:24.937203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-19 17:52:24.937217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.176 [2024-11-19 17:52:24.937276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.176 [2024-11-19 17:52:24.937289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:32.176 #63 NEW cov: 11896 ft: 14964 corp: 35/903b lim: 35 exec/s: 63 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:32.176 [2024-11-19 17:52:24.977042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.176 [2024-11-19 17:52:24.977068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.176 [2024-11-19 17:52:24.977131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.176 [2024-11-19 17:52:24.977145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.176 #64 NEW cov: 11896 ft: 15044 corp: 36/937b lim: 35 exec/s: 64 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:08:32.176 [2024-11-19 17:52:25.017174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.176 [2024-11-19 17:52:25.017201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.176 [2024-11-19 17:52:25.017261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.176 [2024-11-19 17:52:25.017275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.176 [2024-11-19 17:52:25.017335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.176 [2024-11-19 17:52:25.017348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.176 #65 NEW cov: 11896 ft: 15060 corp: 37/971b lim: 35 exec/s: 65 rss: 70Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:32.436 [2024-11-19 17:52:25.057354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.057381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.057444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.057457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.057515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.057528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.057587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.057605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:32.436 #66 NEW cov: 11896 ft: 15109 corp: 38/1006b lim: 35 exec/s: 66 rss: 70Mb L: 35/35 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:32.436 [2024-11-19 17:52:25.097529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.097554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.097629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.097643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.097680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.097694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.097751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.097764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:32.436 #67 NEW cov: 11896 ft: 15120 corp: 39/1041b lim: 35 exec/s: 67 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:32.436 [2024-11-19 17:52:25.137192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.137217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.436 #68 NEW cov: 11896 ft: 15163 corp: 40/1061b lim: 35 exec/s: 68 rss: 70Mb L: 20/35 MS: 1 CrossOver- 00:08:32.436 [2024-11-19 17:52:25.177355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.177380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.436 #69 NEW cov: 11896 ft: 15224 corp: 41/1081b lim: 35 exec/s: 69 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:08:32.436 [2024-11-19 17:52:25.217396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.217420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.436 #70 NEW cov: 11896 ft: 15230 corp: 42/1095b lim: 35 exec/s: 70 rss: 70Mb L: 14/35 MS: 1 InsertByte- 00:08:32.436 [2024-11-19 17:52:25.257680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.257706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.257763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.257776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.436 #71 NEW cov: 11896 ft: 15292 corp: 43/1120b lim: 35 exec/s: 71 rss: 70Mb L: 25/35 MS: 1 CrossOver- 00:08:32.436 [2024-11-19 17:52:25.297805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.297830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.436 [2024-11-19 17:52:25.297906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.436 [2024-11-19 17:52:25.297921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.697 #72 NEW cov: 11896 ft: 15300 corp: 44/1145b lim: 35 exec/s: 72 rss: 70Mb L: 25/35 MS: 1 ShuffleBytes- 00:08:32.697 [2024-11-19 17:52:25.338029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.697 [2024-11-19 
17:52:25.338055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.697 [2024-11-19 17:52:25.338116] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.697 [2024-11-19 17:52:25.338130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.697 [2024-11-19 17:52:25.338188] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.697 [2024-11-19 17:52:25.338202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.697 #73 NEW cov: 11896 ft: 15304 corp: 45/1175b lim: 35 exec/s: 36 rss: 70Mb L: 30/35 MS: 1 ChangeByte- 00:08:32.697 #73 DONE cov: 11896 ft: 15304 corp: 45/1175b lim: 35 exec/s: 36 rss: 70Mb 00:08:32.697 ###### Recommended dictionary. ###### 00:08:32.697 "\000\000\000\000\000\000\000\000" # Uses: 5 00:08:32.697 ###### End of recommended dictionary. ###### 00:08:32.697 Done 73 runs in 2 second(s) 00:08:32.697 17:52:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:32.697 17:52:25 -- ../common.sh@72 -- # (( i++ )) 00:08:32.697 17:52:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.697 17:52:25 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:32.697 17:52:25 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:32.697 17:52:25 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.697 17:52:25 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.697 17:52:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.697 17:52:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:32.697 17:52:25 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:32.697 17:52:25 -- nvmf/run.sh@29 -- # port=4416 00:08:32.697 17:52:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.697 17:52:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:32.697 17:52:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.697 17:52:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:32.697 [2024-11-19 17:52:25.517928] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
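Run 15 winds down the same way run 14 did: a "Recommended dictionary" summary ("\000\000\000\000\000\000\000\000", used 5 times here, after run 14's "\377\213ldc&&(") followed by teardown and the start_llvm_fuzz 16 hand-off. libFuzzer prints these summaries so that productive byte sequences can be fed back into later runs through its standard -dict= option. A hypothetical sketch: the file name and keywords are invented, the values are re-escaped from the C-style octal the log prints into the \xNN hex form libFuzzer dictionary files accept, and whether the SPDK wrapper forwards extra libFuzzer flags to the target is not shown in this log:

    cat > nvmf.dict <<'EOF'
    # libFuzzer/AFL dictionary syntax: keyword="value"
    zeros8="\x00\x00\x00\x00\x00\x00\x00\x00"
    feat14="\xff\x8bldc&&("
    EOF
    # a later invocation could then add: -dict=nvmf.dict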
00:08:32.697 [2024-11-19 17:52:25.517995] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641286 ] 00:08:32.697 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.957 [2024-11-19 17:52:25.769702] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.957 [2024-11-19 17:52:25.798213] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:32.957 [2024-11-19 17:52:25.798348] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.217 [2024-11-19 17:52:25.849945] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.217 [2024-11-19 17:52:25.866266] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:33.217 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.217 INFO: Seed: 3636598699 00:08:33.217 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:33.217 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:33.217 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:33.217 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.217 #2 INITED exec/s: 0 rss: 59Mb 00:08:33.217 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.217 This may also happen if the target rejected all inputs we tried so far 00:08:33.217 [2024-11-19 17:52:25.911703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.217 [2024-11-19 17:52:25.911733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.217 [2024-11-19 17:52:25.911772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.217 [2024-11-19 17:52:25.911788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.217 [2024-11-19 17:52:25.911841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.217 [2024-11-19 17:52:25.911856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.217 [2024-11-19 17:52:25.911910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.217 [2024-11-19 17:52:25.911924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.477 NEW_FUNC[1/671]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:33.477 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.477 #5 NEW cov: 11667 ft: 11668 corp: 2/93b lim: 105 exec/s: 0 rss: 67Mb L: 92/92 MS: 3 CrossOver-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:33.477 
[2024-11-19 17:52:26.222324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.222357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.222392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.222411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.222458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.222474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.222524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.222538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.477 #6 NEW cov: 11780 ft: 12163 corp: 3/185b lim: 105 exec/s: 0 rss: 67Mb L: 92/92 MS: 1 CopyPart- 00:08:33.477 [2024-11-19 17:52:26.272422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.272450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.272497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.272512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.272562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.272577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.272648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.272664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.477 #7 NEW cov: 11786 ft: 12303 corp: 4/277b lim: 105 exec/s: 0 rss: 67Mb L: 92/92 MS: 1 CopyPart- 00:08:33.477 [2024-11-19 17:52:26.312435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.312464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.312499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.477 [2024-11-19 17:52:26.312514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.477 [2024-11-19 17:52:26.312564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.478 [2024-11-19 17:52:26.312580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.478 #8 NEW cov: 11871 ft: 13024 corp: 5/357b lim: 105 exec/s: 0 rss: 67Mb L: 80/92 MS: 1 EraseBytes- 00:08:33.738 [2024-11-19 17:52:26.352563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.352591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.352649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.352666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.352722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.352737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.738 #9 NEW cov: 11871 ft: 13050 corp: 6/438b lim: 105 exec/s: 0 rss: 67Mb L: 81/92 MS: 1 InsertByte- 00:08:33.738 [2024-11-19 17:52:26.392576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.392609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.392663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.392679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.738 #10 NEW cov: 11871 ft: 13594 corp: 7/485b lim: 105 exec/s: 0 rss: 67Mb L: 47/92 MS: 1 EraseBytes- 00:08:33.738 [2024-11-19 17:52:26.432897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.432924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.432972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.432987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.433037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 
[2024-11-19 17:52:26.433052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.433103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.433118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.738 #11 NEW cov: 11871 ft: 13690 corp: 8/586b lim: 105 exec/s: 0 rss: 67Mb L: 101/101 MS: 1 CopyPart- 00:08:33.738 [2024-11-19 17:52:26.472925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.472953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.472995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.473010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.473061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.473091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.738 #12 NEW cov: 11871 ft: 13804 corp: 9/666b lim: 105 exec/s: 0 rss: 67Mb L: 80/101 MS: 1 ChangeBit- 00:08:33.738 [2024-11-19 17:52:26.513147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.513175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.513237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.513255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.513304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.513319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.513368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.513382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.738 #13 NEW cov: 11871 ft: 13866 corp: 10/765b lim: 105 exec/s: 0 rss: 67Mb L: 99/101 MS: 1 InsertRepeatedBytes- 00:08:33.738 [2024-11-19 17:52:26.553257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1467069654916928532 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 
[2024-11-19 17:52:26.553285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.553323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.553340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.738 [2024-11-19 17:52:26.553393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.738 [2024-11-19 17:52:26.553409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.739 [2024-11-19 17:52:26.553460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.739 [2024-11-19 17:52:26.553475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.739 #14 NEW cov: 11871 ft: 13887 corp: 11/857b lim: 105 exec/s: 0 rss: 67Mb L: 92/101 MS: 1 ChangeBinInt- 00:08:33.739 [2024-11-19 17:52:26.593243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.739 [2024-11-19 17:52:26.593271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.739 [2024-11-19 17:52:26.593310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.739 [2024-11-19 17:52:26.593325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.739 [2024-11-19 17:52:26.593378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.739 [2024-11-19 17:52:26.593393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.999 #15 NEW cov: 11871 ft: 13930 corp: 12/938b lim: 105 exec/s: 0 rss: 67Mb L: 81/101 MS: 1 ChangeBit- 00:08:33.999 [2024-11-19 17:52:26.633466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.633492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.633539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.633555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.633612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:70 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.633627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.633692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.633708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.999 #16 NEW cov: 11871 ft: 14000 corp: 13/1031b lim: 105 exec/s: 0 rss: 67Mb L: 93/101 MS: 1 InsertByte- 00:08:33.999 [2024-11-19 17:52:26.673335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.673361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.673398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.673413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.999 #17 NEW cov: 11871 ft: 14086 corp: 14/1093b lim: 105 exec/s: 0 rss: 67Mb L: 62/101 MS: 1 EraseBytes- 00:08:33.999 [2024-11-19 17:52:26.713709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.713735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.713801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.713816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.713868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.713882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.713933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.713948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.999 #18 NEW cov: 11871 ft: 14099 corp: 15/1192b lim: 105 exec/s: 0 rss: 68Mb L: 99/101 MS: 1 ShuffleBytes- 00:08:33.999 [2024-11-19 17:52:26.753741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.753769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.753805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.753820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.753873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8800387989503 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.753888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.999 #19 NEW cov: 11871 ft: 14109 corp: 16/1257b lim: 105 exec/s: 0 rss: 68Mb L: 65/101 MS: 1 EraseBytes- 00:08:33.999 [2024-11-19 17:52:26.793806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.793833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.793871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.793886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.793938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.793952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.999 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:33.999 #20 NEW cov: 11894 ft: 14140 corp: 17/1322b lim: 105 exec/s: 0 rss: 68Mb L: 65/101 MS: 1 CrossOver- 00:08:33.999 [2024-11-19 17:52:26.843934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2836825899008000 len:5141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.843961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.843998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.844013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.999 [2024-11-19 17:52:26.844063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.999 [2024-11-19 17:52:26.844078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.260 #21 NEW cov: 11894 ft: 14171 corp: 18/1389b lim: 105 exec/s: 0 rss: 68Mb L: 67/101 MS: 1 CrossOver- 00:08:34.260 [2024-11-19 17:52:26.884087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.884112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.884164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.884179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.884221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.884236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.260 #22 NEW cov: 11894 ft: 14184 corp: 19/1467b lim: 105 exec/s: 22 rss: 68Mb L: 78/101 MS: 1 EraseBytes- 00:08:34.260 [2024-11-19 17:52:26.924285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.924311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.924350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.924363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.924418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:94 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.924433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.924488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.924503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.260 #28 NEW cov: 11894 ft: 14284 corp: 20/1569b lim: 105 exec/s: 28 rss: 68Mb L: 102/102 MS: 1 InsertByte- 00:08:34.260 [2024-11-19 17:52:26.964367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.964394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.964441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.964456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.964493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:94 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.964508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:26.964560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:26.964575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.260 #29 NEW cov: 11894 ft: 14310 corp: 21/1671b lim: 105 exec/s: 29 rss: 68Mb L: 102/102 MS: 1 CMP- DE: "\322\000\000\000"- 00:08:34.260 [2024-11-19 17:52:27.004517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:27.004544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:27.004592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:27.004611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:27.004665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:27.004695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:27.004746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:27.004762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.260 #30 NEW cov: 11894 ft: 14325 corp: 22/1772b lim: 105 exec/s: 30 rss: 68Mb L: 101/102 MS: 1 PersAutoDict- DE: "\322\000\000\000"- 00:08:34.260 [2024-11-19 17:52:27.044625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.260 [2024-11-19 17:52:27.044651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.260 [2024-11-19 17:52:27.044700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.044718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.261 [2024-11-19 17:52:27.044768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:70 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.044783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.261 [2024-11-19 17:52:27.044834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.044849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.261 #31 NEW cov: 11894 ft: 14336 corp: 23/1866b lim: 105 exec/s: 31 rss: 68Mb L: 94/102 MS: 1 CrossOver- 00:08:34.261 [2024-11-19 17:52:27.084742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.084768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.261 [2024-11-19 17:52:27.084817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.084832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.261 [2024-11-19 17:52:27.084885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.084900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.261 [2024-11-19 17:52:27.084951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.261 [2024-11-19 17:52:27.084965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.261 #32 NEW cov: 11894 ft: 14355 corp: 24/1959b lim: 105 exec/s: 32 rss: 68Mb L: 93/102 MS: 1 CopyPart- 00:08:34.521 [2024-11-19 17:52:27.124751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.521 [2024-11-19 17:52:27.124778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.521 [2024-11-19 17:52:27.124818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.124834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.124886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4294967295 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.124901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 #33 NEW cov: 11894 ft: 14360 corp: 25/2024b lim: 105 exec/s: 33 rss: 68Mb L: 65/102 MS: 1 CopyPart- 00:08:34.522 [2024-11-19 17:52:27.164876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2836825899008000 len:60396 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.164904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.164961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.164976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.165033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.165048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 #39 NEW cov: 11894 ft: 14389 corp: 26/2091b 
lim: 105 exec/s: 39 rss: 68Mb L: 67/102 MS: 1 ChangeBinInt- 00:08:34.522 [2024-11-19 17:52:27.205121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:9217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.205147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.205208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.205225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.205276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6773413839565225984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.205291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.205345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.205360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.522 #40 NEW cov: 11894 ft: 14418 corp: 27/2194b lim: 105 exec/s: 40 rss: 68Mb L: 103/103 MS: 1 InsertByte- 00:08:34.522 [2024-11-19 17:52:27.245042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2836825899008000 len:5141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.245069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.245126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.245141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.245195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.245211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 #41 NEW cov: 11894 ft: 14427 corp: 28/2261b lim: 105 exec/s: 41 rss: 68Mb L: 67/103 MS: 1 ChangeByte- 00:08:34.522 [2024-11-19 17:52:27.285179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.285206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.285266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.285282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 
17:52:27.285333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4294967295 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.285349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 #42 NEW cov: 11894 ft: 14435 corp: 29/2326b lim: 105 exec/s: 42 rss: 68Mb L: 65/103 MS: 1 ChangeBit- 00:08:34.522 [2024-11-19 17:52:27.325402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:9217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.325429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.325493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.325509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.325563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2359296 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.325578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.325641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.325657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.522 #43 NEW cov: 11894 ft: 14437 corp: 30/2429b lim: 105 exec/s: 43 rss: 68Mb L: 103/103 MS: 1 CrossOver- 00:08:34.522 [2024-11-19 17:52:27.365519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:48641 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.365546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.365617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.365632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.365682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6773413839565225984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.365698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.522 [2024-11-19 17:52:27.365760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.522 [2024-11-19 17:52:27.365775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.783 #49 NEW cov: 11894 ft: 14443 corp: 31/2532b lim: 105 exec/s: 49 rss: 69Mb L: 103/103 MS: 1 ChangeByte- 00:08:34.783 [2024-11-19 
17:52:27.405668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.405695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.405758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.405773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.405822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.405837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.405887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.405906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.783 #50 NEW cov: 11894 ft: 14468 corp: 32/2633b lim: 105 exec/s: 50 rss: 69Mb L: 101/103 MS: 1 CopyPart- 00:08:34.783 [2024-11-19 17:52:27.435650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.435677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.435713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.435728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.435778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8800387989503 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.435793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.783 #51 NEW cov: 11894 ft: 14497 corp: 33/2698b lim: 105 exec/s: 51 rss: 69Mb L: 65/103 MS: 1 ShuffleBytes- 00:08:34.783 [2024-11-19 17:52:27.475867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.475893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.475941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.475956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.476007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 
len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.476022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.476074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.476089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.783 #52 NEW cov: 11894 ft: 14536 corp: 34/2791b lim: 105 exec/s: 52 rss: 69Mb L: 93/103 MS: 1 CopyPart- 00:08:34.783 [2024-11-19 17:52:27.515950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.515977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.516024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.516040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.516090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.516105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.516156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.516171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.783 #53 NEW cov: 11894 ft: 14562 corp: 35/2877b lim: 105 exec/s: 53 rss: 69Mb L: 86/103 MS: 1 EraseBytes- 00:08:34.783 [2024-11-19 17:52:27.555779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069414649855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.555806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.783 #54 NEW cov: 11894 ft: 14984 corp: 36/2910b lim: 105 exec/s: 54 rss: 69Mb L: 33/103 MS: 1 EraseBytes- 00:08:34.783 [2024-11-19 17:52:27.596180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.596206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.596256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.596272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.596323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 
len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.596338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.783 [2024-11-19 17:52:27.596388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.596403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.783 #55 NEW cov: 11894 ft: 15002 corp: 37/3002b lim: 105 exec/s: 55 rss: 69Mb L: 92/103 MS: 1 ChangeByte- 00:08:34.783 [2024-11-19 17:52:27.625914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582356735 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.783 [2024-11-19 17:52:27.625940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.044 #56 NEW cov: 11894 ft: 15041 corp: 38/3036b lim: 105 exec/s: 56 rss: 69Mb L: 34/103 MS: 1 CrossOver- 00:08:35.044 [2024-11-19 17:52:27.666355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:9217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.044 [2024-11-19 17:52:27.666381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.044 [2024-11-19 17:52:27.666430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.044 [2024-11-19 17:52:27.666446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.044 [2024-11-19 17:52:27.666494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6773413839565225984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.044 [2024-11-19 17:52:27.666509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.044 [2024-11-19 17:52:27.666562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.666576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.045 #57 NEW cov: 11894 ft: 15045 corp: 39/3139b lim: 105 exec/s: 57 rss: 69Mb L: 103/103 MS: 1 ShuffleBytes- 00:08:35.045 [2024-11-19 17:52:27.696363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2836825899008000 len:5141 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.696390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.696442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.696456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.696508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ 
sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.696524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.045 #58 NEW cov: 11894 ft: 15051 corp: 40/3206b lim: 105 exec/s: 58 rss: 69Mb L: 67/103 MS: 1 ChangeBit- 00:08:35.045 [2024-11-19 17:52:27.736588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.736620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.736686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.736699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.736748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:101 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.736774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.736825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.736840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.045 #59 NEW cov: 11894 ft: 15126 corp: 41/3307b lim: 105 exec/s: 59 rss: 69Mb L: 101/103 MS: 1 ChangeBinInt- 00:08:35.045 [2024-11-19 17:52:27.776681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1446803456593761300 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.776708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.776774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14128939165286400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.776790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.776841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.776856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.776891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.776907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.045 #60 NEW cov: 11894 ft: 15127 corp: 42/3404b lim: 105 exec/s: 60 rss: 69Mb L: 97/103 MS: 1 InsertRepeatedBytes- 00:08:35.045 [2024-11-19 17:52:27.806762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.806788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.806837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.806854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.806903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.806934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.806985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.807001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.045 #61 NEW cov: 11894 ft: 15162 corp: 43/3497b lim: 105 exec/s: 61 rss: 69Mb L: 93/103 MS: 1 ChangeBinInt- 00:08:35.045 [2024-11-19 17:52:27.846928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.846955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.847018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.847033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.847084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7277816997830786816 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.847099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.045 [2024-11-19 17:52:27.847150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.847164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.045 #62 NEW cov: 11894 ft: 15206 corp: 44/3599b lim: 105 exec/s: 62 rss: 69Mb L: 102/103 MS: 1 InsertByte- 00:08:35.045 [2024-11-19 17:52:27.886678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069414649855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.045 [2024-11-19 17:52:27.886704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.306 #63 NEW cov: 11894 ft: 15227 corp: 45/3636b lim: 105 exec/s: 31 rss: 69Mb L: 37/103 MS: 1 PersAutoDict- DE: "\322\000\000\000"- 00:08:35.306 #63 DONE cov: 11894 ft: 15227 corp: 45/3636b lim: 105 
exec/s: 31 rss: 69Mb 00:08:35.306 ###### Recommended dictionary. ###### 00:08:35.306 "\322\000\000\000" # Uses: 2 00:08:35.306 ###### End of recommended dictionary. ###### 00:08:35.306 Done 63 runs in 2 second(s) 00:08:35.306 17:52:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:35.306 17:52:28 -- ../common.sh@72 -- # (( i++ )) 00:08:35.306 17:52:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.306 17:52:28 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:35.306 17:52:28 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:35.306 17:52:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:35.306 17:52:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.306 17:52:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:35.306 17:52:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:35.306 17:52:28 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:35.306 17:52:28 -- nvmf/run.sh@29 -- # port=4417 00:08:35.306 17:52:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:35.306 17:52:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:35.306 17:52:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.306 17:52:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:35.306 [2024-11-19 17:52:28.072695] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:35.306 [2024-11-19 17:52:28.072788] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641584 ] 00:08:35.306 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.566 [2024-11-19 17:52:28.331536] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.566 [2024-11-19 17:52:28.359428] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.566 [2024-11-19 17:52:28.359566] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.566 [2024-11-19 17:52:28.410952] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.566 [2024-11-19 17:52:28.427293] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:35.825 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.825 INFO: Seed: 1903613043 00:08:35.825 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:35.825 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:35.825 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:35.825 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.825 #2 INITED exec/s: 0 rss: 59Mb 00:08:35.825 WARNING: no interesting inputs were found so far. 
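The nvmf/run.sh xtrace lines above show how the harness sets up each fuzzer instance: it zero-pads the fuzzer type into a TCP port (44XX), rewrites the trsvcid in the template JSON config with sed, creates the per-fuzzer corpus directory, and launches llvm_nvme_fuzz against that port. A minimal sketch of that sequence is below, reconstructed only from the traced commands; the variable names, the $rootdir/$output_dir prefixes, and the redirection of sed's output into the /tmp config (xtrace does not show redirections) are assumptions, not the actual run.sh source.

```bash
#!/usr/bin/env bash
# Sketch of the per-fuzzer setup traced at nvmf/run.sh@23-36 above.
# $rootdir/$output_dir and the sed output redirection are assumed.
start_llvm_fuzz() {
    local fuzzer_type=$1   # e.g. 17
    local timen=$2         # run time in seconds, e.g. 1
    local core=$3          # core mask, e.g. 0x1
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf

    # Each fuzzer type gets its own TCP port: 44 + zero-padded type (17 -> 4417).
    local port=44$(printf %02d "$fuzzer_type")

    mkdir -p "$corpus_dir"

    # Rewrite the template config so the NVMe/TCP target listens on this port
    # instead of the default 4420.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Transport ID the fuzzer connects to, matching the rewritten config.
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$output_dir/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
}
```

Giving every fuzzer type a distinct trsvcid and RPC socket presumably keeps successive runs from colliding with the default 4420 listener or with a target left over from the previous iteration of the loop.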
Is the code instrumented for coverage? 00:08:35.825 This may also happen if the target rejected all inputs we tried so far 00:08:35.825 [2024-11-19 17:52:28.494711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.825 [2024-11-19 17:52:28.494753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.825 [2024-11-19 17:52:28.494840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.825 [2024-11-19 17:52:28.494859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.825 [2024-11-19 17:52:28.494929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.825 [2024-11-19 17:52:28.494949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.825 [2024-11-19 17:52:28.495024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.825 [2024-11-19 17:52:28.495044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.085 NEW_FUNC[1/672]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:36.085 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.085 #3 NEW cov: 11688 ft: 11689 corp: 2/110b lim: 120 exec/s: 0 rss: 66Mb L: 109/109 MS: 1 InsertRepeatedBytes- 00:08:36.085 [2024-11-19 17:52:28.814505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.814547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.085 [2024-11-19 17:52:28.814671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.814691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.085 [2024-11-19 17:52:28.814804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.814825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.085 [2024-11-19 17:52:28.814945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.814967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.085 #4 NEW cov: 11801 ft: 12366 corp: 3/222b lim: 120 exec/s: 0 rss: 67Mb L: 112/112 MS: 1 CopyPart- 00:08:36.085 [2024-11-19 17:52:28.864657] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.864689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.085 [2024-11-19 17:52:28.864762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.864784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.085 [2024-11-19 17:52:28.864897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.864914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.085 [2024-11-19 17:52:28.865028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.085 [2024-11-19 17:52:28.865049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.085 #5 NEW cov: 11807 ft: 12588 corp: 4/331b lim: 120 exec/s: 0 rss: 67Mb L: 109/112 MS: 1 ChangeByte- 00:08:36.085 [2024-11-19 17:52:28.904590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.086 [2024-11-19 17:52:28.904627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.086 [2024-11-19 17:52:28.904724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.086 [2024-11-19 17:52:28.904746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.086 [2024-11-19 17:52:28.904855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.086 [2024-11-19 17:52:28.904874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.086 [2024-11-19 17:52:28.904988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.086 [2024-11-19 17:52:28.905009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.086 #6 NEW cov: 11892 ft: 12828 corp: 5/443b lim: 120 exec/s: 0 rss: 67Mb L: 112/112 MS: 1 ChangeByte- 00:08:36.346 [2024-11-19 17:52:28.954589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:28.954641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:28.954739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 
[2024-11-19 17:52:28.954761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:28.954872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:28.954893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.346 #7 NEW cov: 11892 ft: 13410 corp: 6/521b lim: 120 exec/s: 0 rss: 67Mb L: 78/112 MS: 1 EraseBytes- 00:08:36.346 [2024-11-19 17:52:29.004740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.004773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.004858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.004881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.004994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.005016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.346 #13 NEW cov: 11892 ft: 13466 corp: 7/616b lim: 120 exec/s: 0 rss: 67Mb L: 95/112 MS: 1 EraseBytes- 00:08:36.346 [2024-11-19 17:52:29.045068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.045100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.045198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.045219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.045333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.045352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.045470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.045491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.346 #14 NEW cov: 11892 ft: 13570 corp: 8/728b lim: 120 exec/s: 0 rss: 67Mb L: 112/112 MS: 1 CopyPart- 00:08:36.346 [2024-11-19 17:52:29.084899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.084929] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.085056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.085080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.346 #16 NEW cov: 11892 ft: 13916 corp: 9/799b lim: 120 exec/s: 0 rss: 67Mb L: 71/112 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:36.346 [2024-11-19 17:52:29.125172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.125204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.125328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.125347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.125460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.125484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.346 #17 NEW cov: 11892 ft: 13961 corp: 10/894b lim: 120 exec/s: 0 rss: 67Mb L: 95/112 MS: 1 CrossOver- 00:08:36.346 [2024-11-19 17:52:29.165282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.165310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.165408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.346 [2024-11-19 17:52:29.165429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.346 [2024-11-19 17:52:29.165549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.347 [2024-11-19 17:52:29.165570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.347 #18 NEW cov: 11892 ft: 14064 corp: 11/988b lim: 120 exec/s: 0 rss: 67Mb L: 94/112 MS: 1 CrossOver- 00:08:36.605 [2024-11-19 17:52:29.215740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.215769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.215855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.215877] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.215989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.216013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.216130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.216152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.605 #19 NEW cov: 11892 ft: 14092 corp: 12/1097b lim: 120 exec/s: 0 rss: 68Mb L: 109/112 MS: 1 ChangeBit- 00:08:36.605 [2024-11-19 17:52:29.255695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.255727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.255821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.255845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.255963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:9985 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.255983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.256089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.256112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.605 #20 NEW cov: 11892 ft: 14117 corp: 13/1207b lim: 120 exec/s: 0 rss: 68Mb L: 110/112 MS: 1 InsertByte- 00:08:36.605 [2024-11-19 17:52:29.295898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.295926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.296047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.296068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.296184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.296207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 
17:52:29.296290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.296314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.605 #21 NEW cov: 11892 ft: 14139 corp: 14/1316b lim: 120 exec/s: 0 rss: 68Mb L: 109/112 MS: 1 CopyPart- 00:08:36.605 [2024-11-19 17:52:29.335977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.336009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.336090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.336109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.336223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.336243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.336353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.336376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.605 #22 NEW cov: 11892 ft: 14192 corp: 15/1425b lim: 120 exec/s: 0 rss: 68Mb L: 109/112 MS: 1 CrossOver- 00:08:36.605 [2024-11-19 17:52:29.376139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.376168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.376274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.376295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.376406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.376426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.376535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.376554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.605 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.605 #23 NEW cov: 11915 ft: 14224 corp: 
16/1529b lim: 120 exec/s: 0 rss: 68Mb L: 104/112 MS: 1 EraseBytes- 00:08:36.605 [2024-11-19 17:52:29.416318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.416350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.416452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.416472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.416585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.416611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.605 [2024-11-19 17:52:29.416738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.605 [2024-11-19 17:52:29.416759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.606 #24 NEW cov: 11915 ft: 14238 corp: 17/1633b lim: 120 exec/s: 0 rss: 68Mb L: 104/112 MS: 1 ChangeBit- 00:08:36.606 [2024-11-19 17:52:29.456371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.606 [2024-11-19 17:52:29.456404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.606 [2024-11-19 17:52:29.456503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.606 [2024-11-19 17:52:29.456529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.606 [2024-11-19 17:52:29.456657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:3221225472 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.606 [2024-11-19 17:52:29.456675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.606 [2024-11-19 17:52:29.456789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.606 [2024-11-19 17:52:29.456807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.865 #25 NEW cov: 11915 ft: 14333 corp: 18/1742b lim: 120 exec/s: 25 rss: 68Mb L: 109/112 MS: 1 CopyPart- 00:08:36.865 [2024-11-19 17:52:29.496538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.496567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.865 [2024-11-19 17:52:29.496667] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.496690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.865 [2024-11-19 17:52:29.496810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:4076007936 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.496831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.865 [2024-11-19 17:52:29.496950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.496971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.865 #26 NEW cov: 11915 ft: 14364 corp: 19/1859b lim: 120 exec/s: 26 rss: 69Mb L: 117/117 MS: 1 InsertRepeatedBytes- 00:08:36.865 [2024-11-19 17:52:29.536438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.536468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.865 [2024-11-19 17:52:29.536557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.536576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.865 [2024-11-19 17:52:29.536699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.536720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.865 #27 NEW cov: 11915 ft: 14384 corp: 20/1954b lim: 120 exec/s: 27 rss: 69Mb L: 95/117 MS: 1 CopyPart- 00:08:36.865 [2024-11-19 17:52:29.576784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.865 [2024-11-19 17:52:29.576814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.865 [2024-11-19 17:52:29.576904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.576925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.577035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.577055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.577172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 
[2024-11-19 17:52:29.577192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.866 #28 NEW cov: 11915 ft: 14422 corp: 21/2071b lim: 120 exec/s: 28 rss: 69Mb L: 117/117 MS: 1 CrossOver- 00:08:36.866 [2024-11-19 17:52:29.626733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.626767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.626868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.626887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.626993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.627015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.866 #29 NEW cov: 11915 ft: 14446 corp: 22/2166b lim: 120 exec/s: 29 rss: 69Mb L: 95/117 MS: 1 ChangeBit- 00:08:36.866 [2024-11-19 17:52:29.666864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.666898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.666983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12582912 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.667005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.667120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.667140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.866 #30 NEW cov: 11915 ft: 14476 corp: 23/2256b lim: 120 exec/s: 30 rss: 69Mb L: 90/117 MS: 1 CrossOver- 00:08:36.866 [2024-11-19 17:52:29.717281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.717313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.717404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.717427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.717537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 
17:52:29.717560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.866 [2024-11-19 17:52:29.717674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.866 [2024-11-19 17:52:29.717697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.126 #31 NEW cov: 11915 ft: 14483 corp: 24/2368b lim: 120 exec/s: 31 rss: 69Mb L: 112/117 MS: 1 ChangeBinInt- 00:08:37.126 [2024-11-19 17:52:29.757326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.126 [2024-11-19 17:52:29.757358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.126 [2024-11-19 17:52:29.757448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.126 [2024-11-19 17:52:29.757467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.126 [2024-11-19 17:52:29.757582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.126 [2024-11-19 17:52:29.757604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.126 [2024-11-19 17:52:29.757725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.126 [2024-11-19 17:52:29.757748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.126 #32 NEW cov: 11915 ft: 14507 corp: 25/2466b lim: 120 exec/s: 32 rss: 69Mb L: 98/117 MS: 1 InsertRepeatedBytes- 00:08:37.126 [2024-11-19 17:52:29.807496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.126 [2024-11-19 17:52:29.807529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.126 [2024-11-19 17:52:29.807609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.126 [2024-11-19 17:52:29.807630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.807755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.807775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.807891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.807911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:37.127 #33 NEW cov: 11915 ft: 14514 corp: 26/2575b lim: 120 exec/s: 33 rss: 69Mb L: 109/117 MS: 1 ShuffleBytes- 00:08:37.127 [2024-11-19 17:52:29.847658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.847690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.847794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.847816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.847922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.847945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.848061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.848083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.127 #34 NEW cov: 11915 ft: 14526 corp: 27/2694b lim: 120 exec/s: 34 rss: 69Mb L: 119/119 MS: 1 CrossOver- 00:08:37.127 [2024-11-19 17:52:29.897695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.897726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.897810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.897833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.897948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:4076007936 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.897968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.898084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4128768 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.898107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.127 #35 NEW cov: 11915 ft: 14535 corp: 28/2812b lim: 120 exec/s: 35 rss: 69Mb L: 118/119 MS: 1 InsertByte- 00:08:37.127 [2024-11-19 17:52:29.937956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:730312212480 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.937987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.938095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.938115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.938231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.938249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.938369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.938391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.127 #36 NEW cov: 11915 ft: 14544 corp: 29/2921b lim: 120 exec/s: 36 rss: 69Mb L: 109/119 MS: 1 ChangeByte- 00:08:37.127 [2024-11-19 17:52:29.977997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.978026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.978114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.978135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.978250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.978272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.127 [2024-11-19 17:52:29.978387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.127 [2024-11-19 17:52:29.978410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.387 #37 NEW cov: 11915 ft: 14554 corp: 30/3030b lim: 120 exec/s: 37 rss: 69Mb L: 109/119 MS: 1 InsertRepeatedBytes- 00:08:37.387 [2024-11-19 17:52:30.028208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.028239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.028356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.028378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.028490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.028509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.028621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.028644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.387 #38 NEW cov: 11915 ft: 14635 corp: 31/3140b lim: 120 exec/s: 38 rss: 69Mb L: 110/119 MS: 1 InsertByte- 00:08:37.387 [2024-11-19 17:52:30.068295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.068329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.068438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.068457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.068573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.068595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.068721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.068746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.387 #39 NEW cov: 11915 ft: 14675 corp: 32/3252b lim: 120 exec/s: 39 rss: 69Mb L: 112/119 MS: 1 ChangeBit- 00:08:37.387 [2024-11-19 17:52:30.118178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.118215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.118318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.118341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.118463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.118481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.387 #40 NEW cov: 11915 ft: 14706 corp: 33/3347b lim: 120 exec/s: 40 rss: 69Mb L: 95/119 MS: 1 ChangeByte- 00:08:37.387 [2024-11-19 17:52:30.158568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.158602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.387 [2024-11-19 17:52:30.158709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.387 [2024-11-19 17:52:30.158734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.388 [2024-11-19 17:52:30.158848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.158869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.388 [2024-11-19 17:52:30.158986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.159010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.388 #41 NEW cov: 11915 ft: 14711 corp: 34/3457b lim: 120 exec/s: 41 rss: 69Mb L: 110/119 MS: 1 ChangeByte- 00:08:37.388 [2024-11-19 17:52:30.207866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.207896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.388 #46 NEW cov: 11915 ft: 15548 corp: 35/3502b lim: 120 exec/s: 46 rss: 69Mb L: 45/119 MS: 5 ChangeBit-ChangeByte-InsertByte-ChangeBit-CrossOver- 00:08:37.388 [2024-11-19 17:52:30.248884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.248913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.388 [2024-11-19 17:52:30.249001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.249026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.388 [2024-11-19 17:52:30.249131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.249153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.388 [2024-11-19 17:52:30.249260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.388 [2024-11-19 17:52:30.249281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.648 #47 NEW cov: 11915 ft: 15583 corp: 36/3604b lim: 120 exec/s: 47 rss: 69Mb L: 102/119 MS: 1 InsertRepeatedBytes- 00:08:37.648 [2024-11-19 17:52:30.288928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 
lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.288956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.289048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.289072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.289188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.289208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.289319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.289355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.648 #48 NEW cov: 11915 ft: 15633 corp: 37/3714b lim: 120 exec/s: 48 rss: 69Mb L: 110/119 MS: 1 CopyPart- 00:08:37.648 [2024-11-19 17:52:30.338866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.338900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.339008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.339032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.339145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.339167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.648 #49 NEW cov: 11915 ft: 15687 corp: 38/3809b lim: 120 exec/s: 49 rss: 69Mb L: 95/119 MS: 1 ChangeBinInt- 00:08:37.648 [2024-11-19 17:52:30.379006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.379035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.379121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.379139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.379253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.379274] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.648 #50 NEW cov: 11915 ft: 15691 corp: 39/3904b lim: 120 exec/s: 50 rss: 70Mb L: 95/119 MS: 1 ChangeByte- 00:08:37.648 [2024-11-19 17:52:30.429162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.429195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.429288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12582912 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.429309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.429418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.429436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.648 #51 NEW cov: 11915 ft: 15702 corp: 40/3994b lim: 120 exec/s: 51 rss: 70Mb L: 90/119 MS: 1 CopyPart- 00:08:37.648 [2024-11-19 17:52:30.479243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.479273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.479357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.479375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.648 [2024-11-19 17:52:30.479489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.648 [2024-11-19 17:52:30.479508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.648 #52 NEW cov: 11915 ft: 15707 corp: 41/4089b lim: 120 exec/s: 26 rss: 70Mb L: 95/119 MS: 1 ShuffleBytes- 00:08:37.648 #52 DONE cov: 11915 ft: 15707 corp: 41/4089b lim: 120 exec/s: 26 rss: 70Mb 00:08:37.648 Done 52 runs in 2 second(s) 00:08:37.909 17:52:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:37.909 17:52:30 -- ../common.sh@72 -- # (( i++ )) 00:08:37.909 17:52:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.909 17:52:30 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:37.909 17:52:30 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:37.909 17:52:30 -- nvmf/run.sh@24 -- # local timen=1 00:08:37.909 17:52:30 -- nvmf/run.sh@25 -- # local core=0x1 00:08:37.909 17:52:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.909 17:52:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:37.909 17:52:30 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:37.909 17:52:30 -- nvmf/run.sh@29 -- # port=4418 00:08:37.909 17:52:30 -- nvmf/run.sh@30 -- # mkdir -p 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.909 17:52:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:37.909 17:52:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:37.909 17:52:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:37.909 [2024-11-19 17:52:30.664135] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:37.909 [2024-11-19 17:52:30.664222] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642127 ] 00:08:37.909 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.169 [2024-11-19 17:52:30.915426] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.169 [2024-11-19 17:52:30.944231] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:38.169 [2024-11-19 17:52:30.944347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.169 [2024-11-19 17:52:30.995694] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.169 [2024-11-19 17:52:31.012016] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:38.169 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.169 INFO: Seed: 191673273 00:08:38.430 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:38.430 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:38.430 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:38.430 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.430 #2 INITED exec/s: 0 rss: 60Mb 00:08:38.430 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
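The trace above shows the per-target setup that run.sh performs before each fuzzer run: the two-digit target index is folded into the TCP listen port (printf %02d 18 gives port 4418), the canned fuzz_json.conf is rewritten with sed so its trsvcid matches that port, and llvm_nvme_fuzz is launched against the resulting transport ID for one second (-t 1) on core mask 0x1 with a per-target corpus directory and RPC socket. A minimal Bash sketch of those steps, reconstructed from the trace rather than taken from the script itself ($SPDK stands in for the long workspace path, and the redirect into $nvmf_cfg is an assumption, since xtrace does not print redirections):

start_llvm_fuzz() {
    # Arguments as seen in the trace: start_llvm_fuzz 18 1 0x1
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$SPDK/../corpus/llvm_nvmf_${fuzzer_type}
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    local port="44$(printf %02d "$fuzzer_type")"   # 18 -> 4418, 19 -> 4419

    mkdir -p "$corpus_dir"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # Point the stock JSON config at the per-target port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$SPDK/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
    # run.sh removes the temp config afterwards (the rm -rf /tmp/fuzz_json_NN.conf
    # visible at the top of each iteration).
}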
00:08:38.430 This may also happen if the target rejected all inputs we tried so far 00:08:38.430 [2024-11-19 17:52:31.083520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.430 [2024-11-19 17:52:31.083565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.430 [2024-11-19 17:52:31.083684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.430 [2024-11-19 17:52:31.083698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.430 [2024-11-19 17:52:31.083780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.430 [2024-11-19 17:52:31.083798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.430 [2024-11-19 17:52:31.083871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.430 [2024-11-19 17:52:31.083892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.690 NEW_FUNC[1/670]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:38.690 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.690 #4 NEW cov: 11632 ft: 11633 corp: 2/88b lim: 100 exec/s: 0 rss: 67Mb L: 87/87 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:38.690 [2024-11-19 17:52:31.393157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.690 [2024-11-19 17:52:31.393205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.690 [2024-11-19 17:52:31.393334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.690 [2024-11-19 17:52:31.393358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.690 [2024-11-19 17:52:31.393487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.690 [2024-11-19 17:52:31.393515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.690 #7 NEW cov: 11745 ft: 12524 corp: 3/161b lim: 100 exec/s: 0 rss: 67Mb L: 73/87 MS: 3 ChangeBit-ChangeByte-CrossOver- 00:08:38.690 [2024-11-19 17:52:31.432515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.690 [2024-11-19 17:52:31.432549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.690 #8 NEW cov: 11751 ft: 13144 corp: 4/184b lim: 100 exec/s: 0 rss: 67Mb L: 23/87 MS: 1 InsertRepeatedBytes- 00:08:38.690 [2024-11-19 17:52:31.473183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.690 [2024-11-19 17:52:31.473217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.690 [2024-11-19 17:52:31.473308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.690 [2024-11-19 17:52:31.473331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.690 [2024-11-19 17:52:31.473441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.690 [2024-11-19 17:52:31.473459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.690 [2024-11-19 17:52:31.473565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.690 [2024-11-19 17:52:31.473586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.690 #9 NEW cov: 11836 ft: 13342 corp: 5/280b lim: 100 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:38.690 [2024-11-19 17:52:31.522936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.690 [2024-11-19 17:52:31.522962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.690 #10 NEW cov: 11836 ft: 13496 corp: 6/303b lim: 100 exec/s: 0 rss: 67Mb L: 23/96 MS: 1 ChangeBit- 00:08:38.951 [2024-11-19 17:52:31.573724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.951 [2024-11-19 17:52:31.573754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.573820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.951 [2024-11-19 17:52:31.573836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.573945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.951 [2024-11-19 17:52:31.573963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.574077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.951 [2024-11-19 17:52:31.574095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.574214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:38.951 [2024-11-19 17:52:31.574235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.951 #11 NEW cov: 11836 ft: 13662 corp: 7/403b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:38.951 [2024-11-19 17:52:31.613191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.951 [2024-11-19 17:52:31.613223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:38.951 #12 NEW cov: 11836 ft: 13842 corp: 8/426b lim: 100 exec/s: 0 rss: 67Mb L: 23/100 MS: 1 ChangeByte- 00:08:38.951 [2024-11-19 17:52:31.663593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.951 [2024-11-19 17:52:31.663627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.663737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.951 [2024-11-19 17:52:31.663757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.663869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.951 [2024-11-19 17:52:31.663890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.951 #13 NEW cov: 11836 ft: 13864 corp: 9/499b lim: 100 exec/s: 0 rss: 68Mb L: 73/100 MS: 1 ChangeByte- 00:08:38.951 [2024-11-19 17:52:31.713923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.951 [2024-11-19 17:52:31.713955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.714062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.951 [2024-11-19 17:52:31.714082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.714200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.951 [2024-11-19 17:52:31.714222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.714337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.951 [2024-11-19 17:52:31.714358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.951 #14 NEW cov: 11836 ft: 13913 corp: 10/581b lim: 100 exec/s: 0 rss: 68Mb L: 82/100 MS: 1 EraseBytes- 00:08:38.951 [2024-11-19 17:52:31.754013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.951 [2024-11-19 17:52:31.754043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.754108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.951 [2024-11-19 17:52:31.754129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.754241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.951 [2024-11-19 17:52:31.754264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.754394] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.951 [2024-11-19 17:52:31.754414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.951 #15 NEW cov: 11836 ft: 13949 corp: 11/668b lim: 100 exec/s: 0 rss: 68Mb L: 87/100 MS: 1 CMP- DE: "\001\000\000\011"- 00:08:38.951 [2024-11-19 17:52:31.794203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.951 [2024-11-19 17:52:31.794232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.794331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.951 [2024-11-19 17:52:31.794351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.794461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.951 [2024-11-19 17:52:31.794484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.951 [2024-11-19 17:52:31.794602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.951 [2024-11-19 17:52:31.794623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.211 #16 NEW cov: 11836 ft: 14043 corp: 12/750b lim: 100 exec/s: 0 rss: 68Mb L: 82/100 MS: 1 CopyPart- 00:08:39.211 [2024-11-19 17:52:31.843981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.211 [2024-11-19 17:52:31.844008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.211 [2024-11-19 17:52:31.844116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.211 [2024-11-19 17:52:31.844135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.211 #17 NEW cov: 11836 ft: 14392 corp: 13/805b lim: 100 exec/s: 0 rss: 68Mb L: 55/100 MS: 1 EraseBytes- 00:08:39.211 [2024-11-19 17:52:31.884464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.211 [2024-11-19 17:52:31.884493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.211 [2024-11-19 17:52:31.884586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.211 [2024-11-19 17:52:31.884605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.211 [2024-11-19 17:52:31.884720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.211 [2024-11-19 17:52:31.884735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.211 [2024-11-19 17:52:31.884850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:3 nsid:0 00:08:39.211 [2024-11-19 17:52:31.884870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.211 #18 NEW cov: 11836 ft: 14415 corp: 14/889b lim: 100 exec/s: 0 rss: 68Mb L: 84/100 MS: 1 CrossOver- 00:08:39.211 [2024-11-19 17:52:31.924391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.211 [2024-11-19 17:52:31.924421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.211 [2024-11-19 17:52:31.924520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.211 [2024-11-19 17:52:31.924537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.211 [2024-11-19 17:52:31.924661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.211 [2024-11-19 17:52:31.924683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.211 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.211 #19 NEW cov: 11859 ft: 14476 corp: 15/962b lim: 100 exec/s: 0 rss: 68Mb L: 73/100 MS: 1 ChangeByte- 00:08:39.211 [2024-11-19 17:52:31.974229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.211 [2024-11-19 17:52:31.974253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.211 #20 NEW cov: 11859 ft: 14529 corp: 16/985b lim: 100 exec/s: 0 rss: 68Mb L: 23/100 MS: 1 ShuffleBytes- 00:08:39.211 [2024-11-19 17:52:32.014371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.211 [2024-11-19 17:52:32.014395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.211 #21 NEW cov: 11859 ft: 14553 corp: 17/1016b lim: 100 exec/s: 0 rss: 68Mb L: 31/100 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:39.211 [2024-11-19 17:52:32.054616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.211 [2024-11-19 17:52:32.054649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.212 [2024-11-19 17:52:32.054770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.212 [2024-11-19 17:52:32.054792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.472 #22 NEW cov: 11859 ft: 14573 corp: 18/1068b lim: 100 exec/s: 22 rss: 68Mb L: 52/100 MS: 1 EraseBytes- 00:08:39.472 [2024-11-19 17:52:32.094656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.472 [2024-11-19 17:52:32.094687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.472 [2024-11-19 17:52:32.094783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES 
(08) sqid:1 cid:1 nsid:0 00:08:39.472 [2024-11-19 17:52:32.094805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.472 #23 NEW cov: 11859 ft: 14582 corp: 19/1125b lim: 100 exec/s: 23 rss: 68Mb L: 57/100 MS: 1 EraseBytes- 00:08:39.472 [2024-11-19 17:52:32.134904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.472 [2024-11-19 17:52:32.134935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.472 [2024-11-19 17:52:32.135054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.472 [2024-11-19 17:52:32.135079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.472 #24 NEW cov: 11859 ft: 14596 corp: 20/1180b lim: 100 exec/s: 24 rss: 68Mb L: 55/100 MS: 1 ShuffleBytes- 00:08:39.472 [2024-11-19 17:52:32.174959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.472 [2024-11-19 17:52:32.174988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.472 [2024-11-19 17:52:32.175092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.472 [2024-11-19 17:52:32.175114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.472 #25 NEW cov: 11859 ft: 14615 corp: 21/1232b lim: 100 exec/s: 25 rss: 68Mb L: 52/100 MS: 1 ChangeByte- 00:08:39.472 [2024-11-19 17:52:32.225354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.472 [2024-11-19 17:52:32.225384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.472 [2024-11-19 17:52:32.225470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.472 [2024-11-19 17:52:32.225490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.472 [2024-11-19 17:52:32.225612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.472 [2024-11-19 17:52:32.225628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.472 #26 NEW cov: 11859 ft: 14639 corp: 22/1305b lim: 100 exec/s: 26 rss: 68Mb L: 73/100 MS: 1 ChangeBinInt- 00:08:39.472 [2024-11-19 17:52:32.265151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.472 [2024-11-19 17:52:32.265176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.472 #27 NEW cov: 11859 ft: 14650 corp: 23/1328b lim: 100 exec/s: 27 rss: 69Mb L: 23/100 MS: 1 ShuffleBytes- 00:08:39.472 [2024-11-19 17:52:32.315489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.472 [2024-11-19 17:52:32.315518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.472 [2024-11-19 17:52:32.315643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.472 [2024-11-19 17:52:32.315664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.732 #28 NEW cov: 11859 ft: 14700 corp: 24/1383b lim: 100 exec/s: 28 rss: 69Mb L: 55/100 MS: 1 ChangeByte- 00:08:39.732 [2024-11-19 17:52:32.355459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.732 [2024-11-19 17:52:32.355489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.732 #29 NEW cov: 11859 ft: 14769 corp: 25/1421b lim: 100 exec/s: 29 rss: 69Mb L: 38/100 MS: 1 EraseBytes- 00:08:39.732 [2024-11-19 17:52:32.395739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.732 [2024-11-19 17:52:32.395768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.395859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.732 [2024-11-19 17:52:32.395879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.396004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.732 [2024-11-19 17:52:32.396026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.732 #30 NEW cov: 11859 ft: 14773 corp: 26/1494b lim: 100 exec/s: 30 rss: 69Mb L: 73/100 MS: 1 ChangeByte- 00:08:39.732 [2024-11-19 17:52:32.436092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.732 [2024-11-19 17:52:32.436122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.436225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.732 [2024-11-19 17:52:32.436241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.436351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.732 [2024-11-19 17:52:32.436370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.436483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.732 [2024-11-19 17:52:32.436500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.732 #31 NEW cov: 11859 ft: 14787 corp: 27/1576b lim: 100 exec/s: 31 rss: 69Mb L: 82/100 MS: 1 ChangeByte- 00:08:39.732 [2024-11-19 17:52:32.475832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.732 [2024-11-19 17:52:32.475861] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.475967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.732 [2024-11-19 17:52:32.475984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.732 #32 NEW cov: 11859 ft: 14792 corp: 28/1632b lim: 100 exec/s: 32 rss: 69Mb L: 56/100 MS: 1 InsertByte- 00:08:39.732 [2024-11-19 17:52:32.515971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.732 [2024-11-19 17:52:32.515994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.732 #33 NEW cov: 11859 ft: 14798 corp: 29/1664b lim: 100 exec/s: 33 rss: 69Mb L: 32/100 MS: 1 EraseBytes- 00:08:39.732 [2024-11-19 17:52:32.556734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.732 [2024-11-19 17:52:32.556762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.556847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.732 [2024-11-19 17:52:32.556862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.556972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.732 [2024-11-19 17:52:32.556988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.557098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.732 [2024-11-19 17:52:32.557115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.732 [2024-11-19 17:52:32.557229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:39.732 [2024-11-19 17:52:32.557245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:39.733 #34 NEW cov: 11859 ft: 14819 corp: 30/1764b lim: 100 exec/s: 34 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:08:39.993 [2024-11-19 17:52:32.596611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.993 [2024-11-19 17:52:32.596639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.993 [2024-11-19 17:52:32.596719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.993 [2024-11-19 17:52:32.596740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.993 [2024-11-19 17:52:32.596851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.993 [2024-11-19 17:52:32.596872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.993 [2024-11-19 17:52:32.596990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.993 [2024-11-19 17:52:32.597010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.993 #35 NEW cov: 11859 ft: 14873 corp: 31/1862b lim: 100 exec/s: 35 rss: 69Mb L: 98/100 MS: 1 InsertRepeatedBytes- 00:08:39.993 [2024-11-19 17:52:32.636778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.993 [2024-11-19 17:52:32.636809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.993 [2024-11-19 17:52:32.636873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.993 [2024-11-19 17:52:32.636888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.993 [2024-11-19 17:52:32.637000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.993 [2024-11-19 17:52:32.637020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.993 [2024-11-19 17:52:32.637129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.993 [2024-11-19 17:52:32.637147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.676766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.994 [2024-11-19 17:52:32.676795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.676883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.994 [2024-11-19 17:52:32.676904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.677012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.994 [2024-11-19 17:52:32.677033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.994 #40 NEW cov: 11859 ft: 14881 corp: 32/1940b lim: 100 exec/s: 40 rss: 69Mb L: 78/100 MS: 5 EraseBytes-PersAutoDict-CrossOver-CrossOver-EraseBytes- DE: "\001\000\000\011"- 00:08:39.994 [2024-11-19 17:52:32.717256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.994 [2024-11-19 17:52:32.717285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.717370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.994 [2024-11-19 17:52:32.717387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.717505] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.994 [2024-11-19 17:52:32.717528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.717662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.994 [2024-11-19 17:52:32.717708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.717774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:39.994 [2024-11-19 17:52:32.717793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:39.994 #41 NEW cov: 11859 ft: 14903 corp: 33/2040b lim: 100 exec/s: 41 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:39.994 [2024-11-19 17:52:32.757003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.994 [2024-11-19 17:52:32.757033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.757138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.994 [2024-11-19 17:52:32.757160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.757272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.994 [2024-11-19 17:52:32.757297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.994 #42 NEW cov: 11859 ft: 14911 corp: 34/2114b lim: 100 exec/s: 42 rss: 69Mb L: 74/100 MS: 1 InsertByte- 00:08:39.994 [2024-11-19 17:52:32.807524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.994 [2024-11-19 17:52:32.807554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.807682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.994 [2024-11-19 17:52:32.807703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.807821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.994 [2024-11-19 17:52:32.807842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.807958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.994 [2024-11-19 17:52:32.807981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.994 [2024-11-19 17:52:32.808098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:39.994 [2024-11-19 17:52:32.808119] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:39.994 #43 NEW cov: 11859 ft: 14919 corp: 35/2214b lim: 100 exec/s: 43 rss: 69Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:40.384 [2024-11-19 17:52:32.857341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.384 [2024-11-19 17:52:32.857372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.857486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.384 [2024-11-19 17:52:32.857507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.857618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.384 [2024-11-19 17:52:32.857639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.384 #44 NEW cov: 11859 ft: 14946 corp: 36/2287b lim: 100 exec/s: 44 rss: 69Mb L: 73/100 MS: 1 ChangeByte- 00:08:40.384 [2024-11-19 17:52:32.897622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.384 [2024-11-19 17:52:32.897655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.897730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.384 [2024-11-19 17:52:32.897752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.897868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.384 [2024-11-19 17:52:32.897891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.898005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:40.384 [2024-11-19 17:52:32.898027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.384 #45 NEW cov: 11859 ft: 14954 corp: 37/2383b lim: 100 exec/s: 45 rss: 69Mb L: 96/100 MS: 1 ChangeBit- 00:08:40.384 [2024-11-19 17:52:32.937263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.384 [2024-11-19 17:52:32.937296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.384 #46 NEW cov: 11859 ft: 15083 corp: 38/2414b lim: 100 exec/s: 46 rss: 69Mb L: 31/100 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:40.384 [2024-11-19 17:52:32.987884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.384 [2024-11-19 17:52:32.987914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.987977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.384 [2024-11-19 17:52:32.987997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.988109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.384 [2024-11-19 17:52:32.988127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:32.988241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:40.384 [2024-11-19 17:52:32.988260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.384 #47 NEW cov: 11859 ft: 15123 corp: 39/2496b lim: 100 exec/s: 47 rss: 69Mb L: 82/100 MS: 1 CMP- DE: "\000\010"- 00:08:40.384 [2024-11-19 17:52:33.037905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.384 [2024-11-19 17:52:33.037935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:33.038031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.384 [2024-11-19 17:52:33.038050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.384 [2024-11-19 17:52:33.038167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.384 [2024-11-19 17:52:33.038186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.384 #48 NEW cov: 11859 ft: 15153 corp: 40/2570b lim: 100 exec/s: 24 rss: 69Mb L: 74/100 MS: 1 CopyPart- 00:08:40.384 #48 DONE cov: 11859 ft: 15153 corp: 40/2570b lim: 100 exec/s: 24 rss: 69Mb 00:08:40.384 ###### Recommended dictionary. ###### 00:08:40.384 "\001\000\000\011" # Uses: 1 00:08:40.384 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:40.384 "\000\010" # Uses: 0 00:08:40.384 ###### End of recommended dictionary. 
###### 00:08:40.384 Done 48 runs in 2 second(s) 00:08:40.384 17:52:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:40.384 17:52:33 -- ../common.sh@72 -- # (( i++ )) 00:08:40.384 17:52:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.384 17:52:33 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:40.384 17:52:33 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:40.384 17:52:33 -- nvmf/run.sh@24 -- # local timen=1 00:08:40.384 17:52:33 -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.384 17:52:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:40.384 17:52:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:40.384 17:52:33 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:40.384 17:52:33 -- nvmf/run.sh@29 -- # port=4419 00:08:40.384 17:52:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:40.384 17:52:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:40.384 17:52:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:40.384 17:52:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:40.384 [2024-11-19 17:52:33.199155] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:40.384 [2024-11-19 17:52:33.199219] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642665 ] 00:08:40.701 EAL: No free 2048 kB hugepages reported on node 1 00:08:40.701 [2024-11-19 17:52:33.370550] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.701 [2024-11-19 17:52:33.389539] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:40.701 [2024-11-19 17:52:33.389680] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.701 [2024-11-19 17:52:33.441014] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.701 [2024-11-19 17:52:33.457339] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:40.701 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.701 INFO: Seed: 2638651737 00:08:40.701 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:40.701 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:40.701 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:40.701 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.701 #2 INITED exec/s: 0 rss: 59Mb 00:08:40.701 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
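Each "#N NEW" record in the runs above is a standard libFuzzer status line: the leading number is the count of inputs executed so far, cov and ft are covered edges and features, corp is the number of corpus units and their combined size, lim is the current input-length cap, L is the new unit's length over the largest unit currently in the corpus, and MS lists the mutation sequence that produced it (the DONE line repeats the final totals, and the "Recommended dictionary" block collects the CMP/PersAutoDict entries that proved useful). One quick way to pull the coverage trajectory for a run out of a console log like this one (the file name here is illustrative, not part of the pipeline):

# execs, cov, ft, corpus-units/size for every NEW event
grep -oE '#[0-9]+ NEW cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' console.log |
    awk '{ sub("#", "", $1); print $1, $4, $6, $8 }'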
00:08:40.701 This may also happen if the target rejected all inputs we tried so far 00:08:40.701 [2024-11-19 17:52:33.501951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:40.701 [2024-11-19 17:52:33.501984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.701 [2024-11-19 17:52:33.502016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:40.701 [2024-11-19 17:52:33.502032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.701 [2024-11-19 17:52:33.502064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.701 [2024-11-19 17:52:33.502079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.961 NEW_FUNC[1/670]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:40.961 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.961 #3 NEW cov: 11610 ft: 11611 corp: 2/39b lim: 50 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:40.961 [2024-11-19 17:52:33.822614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:40.961 [2024-11-19 17:52:33.822651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.220 #4 NEW cov: 11723 ft: 12499 corp: 3/58b lim: 50 exec/s: 0 rss: 67Mb L: 19/38 MS: 1 CrossOver- 00:08:41.220 [2024-11-19 17:52:33.892715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.220 [2024-11-19 17:52:33.892745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.220 #5 NEW cov: 11729 ft: 12652 corp: 4/68b lim: 50 exec/s: 0 rss: 67Mb L: 10/38 MS: 1 EraseBytes- 00:08:41.220 [2024-11-19 17:52:33.952896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.220 [2024-11-19 17:52:33.952925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.220 [2024-11-19 17:52:33.952973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069599133695 len:65536 00:08:41.220 [2024-11-19 17:52:33.952990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.220 #7 NEW cov: 11814 ft: 13215 corp: 5/96b lim: 50 exec/s: 0 rss: 67Mb L: 28/38 MS: 2 EraseBytes-CrossOver- 00:08:41.220 [2024-11-19 17:52:34.023087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.220 [2024-11-19 17:52:34.023118] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.221 [2024-11-19 17:52:34.023151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:41.221 [2024-11-19 17:52:34.023168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.221 #8 NEW cov: 11814 ft: 13342 corp: 6/116b lim: 50 exec/s: 0 rss: 67Mb L: 20/38 MS: 1 InsertByte- 00:08:41.221 [2024-11-19 17:52:34.073232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463698244468735 len:65536 00:08:41.221 [2024-11-19 17:52:34.073261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.221 [2024-11-19 17:52:34.073308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:41.221 [2024-11-19 17:52:34.073325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.221 [2024-11-19 17:52:34.073353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.221 [2024-11-19 17:52:34.073369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.480 #9 NEW cov: 11814 ft: 13495 corp: 7/155b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 InsertByte- 00:08:41.480 [2024-11-19 17:52:34.133282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.480 [2024-11-19 17:52:34.133310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.480 #10 NEW cov: 11814 ft: 13546 corp: 8/174b lim: 50 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 CMP- DE: "\377\213lk\013\211e\356"- 00:08:41.480 [2024-11-19 17:52:34.183496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.480 [2024-11-19 17:52:34.183535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.480 [2024-11-19 17:52:34.183567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377782704415440895 len:65536 00:08:41.480 [2024-11-19 17:52:34.183585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.480 #11 NEW cov: 11814 ft: 13589 corp: 9/197b lim: 50 exec/s: 0 rss: 67Mb L: 23/39 MS: 1 CopyPart- 00:08:41.480 [2024-11-19 17:52:34.233633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.480 [2024-11-19 17:52:34.233663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.481 [2024-11-19 17:52:34.233695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:18384831673461112831 len:65536 00:08:41.481 [2024-11-19 17:52:34.233713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.481 #12 NEW cov: 11814 ft: 13654 corp: 10/221b lim: 50 exec/s: 0 rss: 67Mb L: 24/39 MS: 1 InsertByte- 00:08:41.481 [2024-11-19 17:52:34.293855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.481 [2024-11-19 17:52:34.293885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.481 [2024-11-19 17:52:34.293930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377782704415440895 len:65536 00:08:41.481 [2024-11-19 17:52:34.293947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.481 [2024-11-19 17:52:34.293975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.481 [2024-11-19 17:52:34.293991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.481 [2024-11-19 17:52:34.294018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2597169610109222911 len:65536 00:08:41.481 [2024-11-19 17:52:34.294034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.481 #13 NEW cov: 11814 ft: 13924 corp: 11/264b lim: 50 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 CrossOver- 00:08:41.741 [2024-11-19 17:52:34.353989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:41.741 [2024-11-19 17:52:34.354019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.741 [2024-11-19 17:52:34.354065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462598917390335 len:1 00:08:41.741 [2024-11-19 17:52:34.354085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.741 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:41.741 #19 NEW cov: 11831 ft: 13982 corp: 12/292b lim: 50 exec/s: 0 rss: 68Mb L: 28/43 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:41.741 [2024-11-19 17:52:34.424181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743021442564095 len:65536 00:08:41.741 [2024-11-19 17:52:34.424212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.741 [2024-11-19 17:52:34.424244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967040 len:1 00:08:41.741 [2024-11-19 17:52:34.424262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.741 #20 NEW cov: 11831 ft: 14010 corp: 13/320b lim: 
50 exec/s: 0 rss: 68Mb L: 28/43 MS: 1 CopyPart- 00:08:41.741 [2024-11-19 17:52:34.484370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463698244468735 len:65536 00:08:41.741 [2024-11-19 17:52:34.484401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.741 [2024-11-19 17:52:34.484458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:41.741 [2024-11-19 17:52:34.484475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.741 [2024-11-19 17:52:34.484503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.741 [2024-11-19 17:52:34.484518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.741 [2024-11-19 17:52:34.484545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18374969054159962111 len:65536 00:08:41.741 [2024-11-19 17:52:34.484561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.741 #21 NEW cov: 11831 ft: 14029 corp: 14/361b lim: 50 exec/s: 21 rss: 68Mb L: 41/43 MS: 1 CMP- DE: "\001\000"- 00:08:41.741 [2024-11-19 17:52:34.554443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743081572105994 len:65536 00:08:41.741 [2024-11-19 17:52:34.554472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.741 [2024-11-19 17:52:34.554519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18384831673461112831 len:65536 00:08:41.741 [2024-11-19 17:52:34.554536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.741 #22 NEW cov: 11831 ft: 14121 corp: 15/385b lim: 50 exec/s: 22 rss: 68Mb L: 24/43 MS: 1 ChangeBinInt- 00:08:42.002 [2024-11-19 17:52:34.604676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463698244468735 len:65536 00:08:42.002 [2024-11-19 17:52:34.604707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.604739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.002 [2024-11-19 17:52:34.604756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.604785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:11264 00:08:42.002 [2024-11-19 17:52:34.604800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.002 #28 NEW cov: 11831 ft: 14158 corp: 16/424b lim: 50 exec/s: 28 rss: 68Mb L: 39/43 MS: 1 ChangeByte- 
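The MS: field on each NEW line above is libFuzzer's mutation sequence, naming the mutator (InsertByte, CrossOver, ChangeBinInt, and so on) that produced the coverage-increasing input. Assuming this console output is captured to a file (fuzz.log is a stand-in name, not one this job is shown writing), a rough tally of which mutators are paying off takes one pipeline:

    grep -Eo '(InsertByte|InsertRepeatedBytes|CopyPart|CrossOver|ChangeByte|ChangeBit|ChangeBinInt|ChangeASCIIInt|ShuffleBytes|EraseBytes|PersAutoDict|CMP)-' fuzz.log \
      | sort | uniq -c | sort -rn

The alternation only lists the mutators that actually appear in this log; any others would need to be added by hand, and compound sequences (e.g. InsertByte-CopyPart-) are counted once per operator.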
00:08:42.002 [2024-11-19 17:52:34.654805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463698244468735 len:65536 00:08:42.002 [2024-11-19 17:52:34.654834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.654880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.002 [2024-11-19 17:52:34.654897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.654925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:9472 00:08:42.002 [2024-11-19 17:52:34.654941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.654968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446463702539436031 len:65536 00:08:42.002 [2024-11-19 17:52:34.654983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.002 #29 NEW cov: 11831 ft: 14228 corp: 17/466b lim: 50 exec/s: 29 rss: 68Mb L: 42/43 MS: 1 InsertByte- 00:08:42.002 [2024-11-19 17:52:34.724940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.002 [2024-11-19 17:52:34.724971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.725004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462598917390335 len:1 00:08:42.002 [2024-11-19 17:52:34.725021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 #30 NEW cov: 11831 ft: 14304 corp: 18/494b lim: 50 exec/s: 30 rss: 68Mb L: 28/43 MS: 1 CMP- DE: "\000\000\000\366"- 00:08:42.002 [2024-11-19 17:52:34.775109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:42.002 [2024-11-19 17:52:34.775139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.775183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.002 [2024-11-19 17:52:34.775201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.775229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446462603027808255 len:1 00:08:42.002 [2024-11-19 17:52:34.775244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.775271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069414584320 
len:65536 00:08:42.002 [2024-11-19 17:52:34.775287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.002 #31 NEW cov: 11831 ft: 14334 corp: 19/540b lim: 50 exec/s: 31 rss: 68Mb L: 46/46 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:42.002 [2024-11-19 17:52:34.825277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463698244468735 len:65536 00:08:42.002 [2024-11-19 17:52:34.825306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.825355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.002 [2024-11-19 17:52:34.825373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.825401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:9472 00:08:42.002 [2024-11-19 17:52:34.825417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.002 [2024-11-19 17:52:34.825444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446463702539379455 len:65536 00:08:42.002 [2024-11-19 17:52:34.825459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.262 #32 NEW cov: 11831 ft: 14415 corp: 20/582b lim: 50 exec/s: 32 rss: 68Mb L: 42/46 MS: 1 ChangeByte- 00:08:42.262 [2024-11-19 17:52:34.895433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.262 [2024-11-19 17:52:34.895462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.262 [2024-11-19 17:52:34.895507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377782704415440895 len:65536 00:08:42.262 [2024-11-19 17:52:34.895524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.262 [2024-11-19 17:52:34.895552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:42.262 [2024-11-19 17:52:34.895568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.262 [2024-11-19 17:52:34.895595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2597169610109222911 len:65536 00:08:42.262 [2024-11-19 17:52:34.895619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.262 #33 NEW cov: 11831 ft: 14447 corp: 21/625b lim: 50 exec/s: 33 rss: 68Mb L: 43/46 MS: 1 CrossOver- 00:08:42.263 [2024-11-19 17:52:34.955517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:9216 
00:08:42.263 [2024-11-19 17:52:34.955546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-11-19 17:52:34.955593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.263 [2024-11-19 17:52:34.955618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 #34 NEW cov: 11831 ft: 14494 corp: 22/645b lim: 50 exec/s: 34 rss: 68Mb L: 20/46 MS: 1 InsertByte- 00:08:42.263 [2024-11-19 17:52:35.005652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744039349813002 len:65536 00:08:42.263 [2024-11-19 17:52:35.005680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-11-19 17:52:35.005727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069599133695 len:65536 00:08:42.263 [2024-11-19 17:52:35.005744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 #35 NEW cov: 11831 ft: 14512 corp: 23/673b lim: 50 exec/s: 35 rss: 68Mb L: 28/46 MS: 1 ChangeBinInt- 00:08:42.263 [2024-11-19 17:52:35.055780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.263 [2024-11-19 17:52:35.055809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-11-19 17:52:35.055855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18411841176781979647 len:1 00:08:42.263 [2024-11-19 17:52:35.055873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 #36 NEW cov: 11831 ft: 14518 corp: 24/701b lim: 50 exec/s: 36 rss: 68Mb L: 28/46 MS: 1 ChangeByte- 00:08:42.263 [2024-11-19 17:52:35.115970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.263 [2024-11-19 17:52:35.115998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-11-19 17:52:35.116043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12876550762359075506 len:45747 00:08:42.263 [2024-11-19 17:52:35.116060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 [2024-11-19 17:52:35.116089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:9511602417301454847 len:1 00:08:42.263 [2024-11-19 17:52:35.116105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 #37 NEW cov: 11831 ft: 14531 corp: 25/738b lim: 50 exec/s: 37 rss: 69Mb L: 37/46 MS: 1 InsertRepeatedBytes- 00:08:42.523 [2024-11-19 17:52:35.176158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.523 [2024-11-19 17:52:35.176186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.176232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12876550762359075506 len:45747 00:08:42.523 [2024-11-19 17:52:35.176249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.176278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:9511602881157922815 len:1 00:08:42.523 [2024-11-19 17:52:35.176294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 #38 NEW cov: 11831 ft: 14542 corp: 26/775b lim: 50 exec/s: 38 rss: 69Mb L: 37/46 MS: 1 ChangeByte- 00:08:42.523 [2024-11-19 17:52:35.236357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:42.523 [2024-11-19 17:52:35.236386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.236432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.523 [2024-11-19 17:52:35.236449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.236477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446462603027808255 len:1 00:08:42.523 [2024-11-19 17:52:35.236493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.236520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069414584320 len:65536 00:08:42.523 [2024-11-19 17:52:35.236536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #39 NEW cov: 11831 ft: 14552 corp: 27/822b lim: 50 exec/s: 39 rss: 69Mb L: 47/47 MS: 1 InsertByte- 00:08:42.523 [2024-11-19 17:52:35.306445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743081572105994 len:65536 00:08:42.523 [2024-11-19 17:52:35.306474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.306520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18384831673461112831 len:17152 00:08:42.523 [2024-11-19 17:52:35.306538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 #40 NEW cov: 11831 ft: 14579 corp: 28/846b lim: 50 exec/s: 40 rss: 69Mb L: 24/47 MS: 1 ChangeByte- 00:08:42.523 [2024-11-19 17:52:35.366709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.523 
[2024-11-19 17:52:35.366738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.366783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377782704415440895 len:65536 00:08:42.523 [2024-11-19 17:52:35.366800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.366828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:42.523 [2024-11-19 17:52:35.366844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-11-19 17:52:35.366871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2594073389660372991 len:1024 00:08:42.523 [2024-11-19 17:52:35.366886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.782 #41 NEW cov: 11838 ft: 14592 corp: 29/889b lim: 50 exec/s: 41 rss: 69Mb L: 43/47 MS: 1 CMP- DE: "\000\000\000\003"- 00:08:42.783 [2024-11-19 17:52:35.436839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.783 [2024-11-19 17:52:35.436869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.783 [2024-11-19 17:52:35.436902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12876550762359075506 len:45824 00:08:42.783 [2024-11-19 17:52:35.436920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.783 #42 NEW cov: 11838 ft: 14653 corp: 30/912b lim: 50 exec/s: 42 rss: 69Mb L: 23/47 MS: 1 EraseBytes- 00:08:42.783 [2024-11-19 17:52:35.486905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551370 len:65536 00:08:42.783 [2024-11-19 17:52:35.486934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.783 [2024-11-19 17:52:35.486981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069599133695 len:65536 00:08:42.783 [2024-11-19 17:52:35.486998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.783 #43 NEW cov: 11838 ft: 14663 corp: 31/940b lim: 50 exec/s: 21 rss: 69Mb L: 28/47 MS: 1 ChangeBit- 00:08:42.783 #43 DONE cov: 11838 ft: 14663 corp: 31/940b lim: 50 exec/s: 21 rss: 69Mb 00:08:42.783 ###### Recommended dictionary. ###### 00:08:42.783 "\377\213lk\013\211e\356" # Uses: 0 00:08:42.783 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:42.783 "\001\000" # Uses: 0 00:08:42.783 "\000\000\000\366" # Uses: 0 00:08:42.783 "\000\000\000\003" # Uses: 0 00:08:42.783 ###### End of recommended dictionary. 
###### 00:08:42.783 Done 43 runs in 2 second(s) 00:08:42.783 17:52:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:42.783 17:52:35 -- ../common.sh@72 -- # (( i++ )) 00:08:42.783 17:52:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.783 17:52:35 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:42.783 17:52:35 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:42.783 17:52:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:42.783 17:52:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.783 17:52:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.783 17:52:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:42.783 17:52:35 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:42.783 17:52:35 -- nvmf/run.sh@29 -- # port=4420 00:08:42.783 17:52:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.783 17:52:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:42.783 17:52:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.783 17:52:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:43.042 [2024-11-19 17:52:35.668024] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:43.042 [2024-11-19 17:52:35.668117] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642960 ] 00:08:43.042 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.042 [2024-11-19 17:52:35.846426] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.042 [2024-11-19 17:52:35.866710] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:43.043 [2024-11-19 17:52:35.866844] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.302 [2024-11-19 17:52:35.918267] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.302 [2024-11-19 17:52:35.934616] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:43.302 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.302 INFO: Seed: 818685502 00:08:43.302 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:43.302 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:43.302 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:43.302 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.302 #2 INITED exec/s: 0 rss: 59Mb 00:08:43.302 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
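The recommended-dictionary block that closes each run above is libFuzzer suggesting seed tokens it found useful. A minimal sketch of feeding them back in, assuming the file name and key names are arbitrary choices, with the octal \NNN escapes printed above rewritten as the \xNN hex form the dictionary parser expects:

    cat > nvmf_19.dict <<'EOF'
    # tokens from the run-19 recommended dictionary, \NNN octal -> \xNN hex
    kw_magic="\xff\x8blk\x0b\x89e\xee"
    kw_zero8="\x00\x00\x00\x00\x00\x00\x00\x00"
    kw_one="\x01\x00"
    kw_f6="\x00\x00\x00\xf6"
    kw_03="\x00\x00\x00\x03"
    EOF

A stock libFuzzer target would then take -dict=nvmf_19.dict on its command line; whether the llvm_nvme_fuzz wrapper invoked above forwards that flag is not confirmed by this log.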
00:08:43.302 This may also happen if the target rejected all inputs we tried so far 00:08:43.302 [2024-11-19 17:52:35.983124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.302 [2024-11-19 17:52:35.983155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.302 [2024-11-19 17:52:35.983211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.302 [2024-11-19 17:52:35.983227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.561 NEW_FUNC[1/672]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:43.561 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.561 #18 NEW cov: 11668 ft: 11669 corp: 2/44b lim: 90 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:08:43.561 [2024-11-19 17:52:36.304647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.561 [2024-11-19 17:52:36.304714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.561 [2024-11-19 17:52:36.304876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.561 [2024-11-19 17:52:36.304914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.561 #22 NEW cov: 11781 ft: 12555 corp: 3/94b lim: 90 exec/s: 0 rss: 67Mb L: 50/50 MS: 4 InsertByte-CopyPart-CrossOver-InsertRepeatedBytes- 00:08:43.561 [2024-11-19 17:52:36.344589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.561 [2024-11-19 17:52:36.344625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.562 [2024-11-19 17:52:36.344749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.562 [2024-11-19 17:52:36.344770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.562 #28 NEW cov: 11787 ft: 12734 corp: 4/144b lim: 90 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:43.562 [2024-11-19 17:52:36.384447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.562 [2024-11-19 17:52:36.384473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.562 #29 NEW cov: 11872 ft: 13713 corp: 5/169b lim: 90 exec/s: 0 rss: 67Mb L: 25/50 MS: 1 EraseBytes- 00:08:43.821 [2024-11-19 17:52:36.434759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.434784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 [2024-11-19 17:52:36.434909] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.821 [2024-11-19 17:52:36.434930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.821 #30 NEW cov: 11872 ft: 13873 corp: 6/219b lim: 90 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeASCIIInt- 00:08:43.821 [2024-11-19 17:52:36.474671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.474696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 #34 NEW cov: 11872 ft: 13976 corp: 7/240b lim: 90 exec/s: 0 rss: 67Mb L: 21/50 MS: 4 CMP-CMP-ChangeByte-CrossOver- DE: "\377\377\377\013"-"\001\003"- 00:08:43.821 [2024-11-19 17:52:36.515058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.515084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 [2024-11-19 17:52:36.515207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.821 [2024-11-19 17:52:36.515225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.821 #35 NEW cov: 11872 ft: 14099 corp: 8/283b lim: 90 exec/s: 0 rss: 67Mb L: 43/50 MS: 1 ShuffleBytes- 00:08:43.821 [2024-11-19 17:52:36.554979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.555004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 #36 NEW cov: 11872 ft: 14128 corp: 9/308b lim: 90 exec/s: 0 rss: 67Mb L: 25/50 MS: 1 CopyPart- 00:08:43.821 [2024-11-19 17:52:36.595612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.595646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 [2024-11-19 17:52:36.595731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.821 [2024-11-19 17:52:36.595755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.821 [2024-11-19 17:52:36.595876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.821 [2024-11-19 17:52:36.595898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.821 #37 NEW cov: 11872 ft: 14504 corp: 10/376b lim: 90 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 CopyPart- 00:08:43.821 [2024-11-19 17:52:36.635647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.635680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 [2024-11-19 17:52:36.635779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.821 [2024-11-19 17:52:36.635801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.821 [2024-11-19 17:52:36.635911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.821 [2024-11-19 17:52:36.635937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.821 #38 NEW cov: 11872 ft: 14583 corp: 11/444b lim: 90 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ShuffleBytes- 00:08:43.821 [2024-11-19 17:52:36.675383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.821 [2024-11-19 17:52:36.675408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 #39 NEW cov: 11872 ft: 14607 corp: 12/465b lim: 90 exec/s: 0 rss: 67Mb L: 21/68 MS: 1 EraseBytes- 00:08:44.081 [2024-11-19 17:52:36.715861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.081 [2024-11-19 17:52:36.715893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-11-19 17:52:36.715999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.081 [2024-11-19 17:52:36.716018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 [2024-11-19 17:52:36.716128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.081 [2024-11-19 17:52:36.716149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.081 #40 NEW cov: 11872 ft: 14631 corp: 13/533b lim: 90 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 CopyPart- 00:08:44.081 [2024-11-19 17:52:36.755585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.081 [2024-11-19 17:52:36.755616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 #41 NEW cov: 11872 ft: 14648 corp: 14/558b lim: 90 exec/s: 0 rss: 67Mb L: 25/68 MS: 1 ChangeBinInt- 00:08:44.081 [2024-11-19 17:52:36.795695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.081 [2024-11-19 17:52:36.795722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 #42 NEW cov: 11872 ft: 14686 corp: 15/579b lim: 90 exec/s: 0 rss: 68Mb L: 21/68 MS: 1 CopyPart- 00:08:44.081 [2024-11-19 17:52:36.846068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.081 [2024-11-19 17:52:36.846100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-11-19 17:52:36.846216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.081 [2024-11-19 17:52:36.846242] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.081 #48 NEW cov: 11895 ft: 14739 corp: 16/629b lim: 90 exec/s: 0 rss: 68Mb L: 50/68 MS: 1 ChangeByte- 00:08:44.081 [2024-11-19 17:52:36.896264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.081 [2024-11-19 17:52:36.896297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-11-19 17:52:36.896423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.081 [2024-11-19 17:52:36.896443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 #49 NEW cov: 11895 ft: 14776 corp: 17/679b lim: 90 exec/s: 0 rss: 68Mb L: 50/68 MS: 1 ChangeBit- 00:08:44.081 [2024-11-19 17:52:36.936054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.081 [2024-11-19 17:52:36.936080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #50 NEW cov: 11895 ft: 14790 corp: 18/705b lim: 90 exec/s: 0 rss: 68Mb L: 26/68 MS: 1 CrossOver- 00:08:44.341 [2024-11-19 17:52:36.986179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.341 [2024-11-19 17:52:36.986220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #51 NEW cov: 11895 ft: 14811 corp: 19/727b lim: 90 exec/s: 51 rss: 68Mb L: 22/68 MS: 1 EraseBytes- 00:08:44.341 [2024-11-19 17:52:37.026327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.341 [2024-11-19 17:52:37.026358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #52 NEW cov: 11895 ft: 14884 corp: 20/754b lim: 90 exec/s: 52 rss: 68Mb L: 27/68 MS: 1 EraseBytes- 00:08:44.341 [2024-11-19 17:52:37.066949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.341 [2024-11-19 17:52:37.066980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 [2024-11-19 17:52:37.067048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.341 [2024-11-19 17:52:37.067084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.341 [2024-11-19 17:52:37.067201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.341 [2024-11-19 17:52:37.067224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.341 #53 NEW cov: 11895 ft: 14915 corp: 21/824b lim: 90 exec/s: 53 rss: 68Mb L: 70/70 MS: 1 PersAutoDict- DE: "\001\003"- 00:08:44.341 [2024-11-19 17:52:37.116818] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.341 [2024-11-19 17:52:37.116849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 [2024-11-19 17:52:37.116981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.341 [2024-11-19 17:52:37.117005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.341 #54 NEW cov: 11895 ft: 14928 corp: 22/874b lim: 90 exec/s: 54 rss: 68Mb L: 50/70 MS: 1 ChangeBinInt- 00:08:44.341 [2024-11-19 17:52:37.156998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.341 [2024-11-19 17:52:37.157023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 [2024-11-19 17:52:37.157157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.341 [2024-11-19 17:52:37.157178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.341 #55 NEW cov: 11895 ft: 14962 corp: 23/924b lim: 90 exec/s: 55 rss: 68Mb L: 50/70 MS: 1 ChangeBit- 00:08:44.341 [2024-11-19 17:52:37.197074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.341 [2024-11-19 17:52:37.197104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 [2024-11-19 17:52:37.197229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.341 [2024-11-19 17:52:37.197248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.601 #56 NEW cov: 11895 ft: 14972 corp: 24/972b lim: 90 exec/s: 56 rss: 68Mb L: 48/70 MS: 1 CrossOver- 00:08:44.601 [2024-11-19 17:52:37.237419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.601 [2024-11-19 17:52:37.237452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 [2024-11-19 17:52:37.237577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.601 [2024-11-19 17:52:37.237603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.601 [2024-11-19 17:52:37.237727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.601 [2024-11-19 17:52:37.237750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.601 #57 NEW cov: 11895 ft: 14986 corp: 25/1042b lim: 90 exec/s: 57 rss: 68Mb L: 70/70 MS: 1 ChangeBinInt- 00:08:44.601 [2024-11-19 17:52:37.287183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.601 [2024-11-19 17:52:37.287217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 #58 NEW cov: 11895 ft: 14995 corp: 26/1067b lim: 90 exec/s: 58 rss: 69Mb L: 25/70 MS: 1 ShuffleBytes- 00:08:44.601 [2024-11-19 17:52:37.337287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.601 [2024-11-19 17:52:37.337313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 #59 NEW cov: 11895 ft: 14996 corp: 27/1092b lim: 90 exec/s: 59 rss: 69Mb L: 25/70 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:44.601 [2024-11-19 17:52:37.377424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.601 [2024-11-19 17:52:37.377451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 #60 NEW cov: 11895 ft: 15005 corp: 28/1117b lim: 90 exec/s: 60 rss: 69Mb L: 25/70 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:44.601 [2024-11-19 17:52:37.428073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.601 [2024-11-19 17:52:37.428109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 [2024-11-19 17:52:37.428234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.601 [2024-11-19 17:52:37.428259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.601 [2024-11-19 17:52:37.428391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.601 [2024-11-19 17:52:37.428414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.601 #61 NEW cov: 11895 ft: 15019 corp: 29/1185b lim: 90 exec/s: 61 rss: 69Mb L: 68/70 MS: 1 ChangeBinInt- 00:08:44.860 [2024-11-19 17:52:37.477957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.861 [2024-11-19 17:52:37.477983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.478103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.861 [2024-11-19 17:52:37.478128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 #62 NEW cov: 11895 ft: 15022 corp: 30/1228b lim: 90 exec/s: 62 rss: 69Mb L: 43/70 MS: 1 ChangeByte- 00:08:44.861 [2024-11-19 17:52:37.518011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.861 [2024-11-19 17:52:37.518038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.518162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.861 [2024-11-19 17:52:37.518186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 #63 NEW cov: 11895 ft: 15033 corp: 31/1278b lim: 90 exec/s: 63 rss: 69Mb L: 50/70 MS: 1 ChangeByte- 00:08:44.861 [2024-11-19 17:52:37.558546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.861 [2024-11-19 17:52:37.558578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.558707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.861 [2024-11-19 17:52:37.558727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.558847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.861 [2024-11-19 17:52:37.558871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.558990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.861 [2024-11-19 17:52:37.559011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.861 #64 NEW cov: 11895 ft: 15362 corp: 32/1351b lim: 90 exec/s: 64 rss: 69Mb L: 73/73 MS: 1 CopyPart- 00:08:44.861 [2024-11-19 17:52:37.608487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.861 [2024-11-19 17:52:37.608518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.608651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.861 [2024-11-19 17:52:37.608673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.608799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.861 [2024-11-19 17:52:37.608821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.861 #65 NEW cov: 11895 ft: 15367 corp: 33/1420b lim: 90 exec/s: 65 rss: 69Mb L: 69/73 MS: 1 InsertByte- 00:08:44.861 [2024-11-19 17:52:37.658676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.861 [2024-11-19 17:52:37.658712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.658846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.861 [2024-11-19 17:52:37.658870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.659005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.861 [2024-11-19 17:52:37.659028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:44.861 #66 NEW cov: 11895 ft: 15388 corp: 34/1489b lim: 90 exec/s: 66 rss: 69Mb L: 69/73 MS: 1 ChangeByte- 00:08:44.861 [2024-11-19 17:52:37.708570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.861 [2024-11-19 17:52:37.708604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-11-19 17:52:37.708723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.861 [2024-11-19 17:52:37.708741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 #67 NEW cov: 11895 ft: 15400 corp: 35/1536b lim: 90 exec/s: 67 rss: 69Mb L: 47/73 MS: 1 EraseBytes- 00:08:45.121 [2024-11-19 17:52:37.748398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.121 [2024-11-19 17:52:37.748424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 #68 NEW cov: 11895 ft: 15410 corp: 36/1555b lim: 90 exec/s: 68 rss: 69Mb L: 19/73 MS: 1 EraseBytes- 00:08:45.121 [2024-11-19 17:52:37.788841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.121 [2024-11-19 17:52:37.788866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.788999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.121 [2024-11-19 17:52:37.789022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 #69 NEW cov: 11895 ft: 15420 corp: 37/1605b lim: 90 exec/s: 69 rss: 69Mb L: 50/73 MS: 1 ChangeBinInt- 00:08:45.121 [2024-11-19 17:52:37.829049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.121 [2024-11-19 17:52:37.829077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.829202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.121 [2024-11-19 17:52:37.829226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 #70 NEW cov: 11895 ft: 15426 corp: 38/1642b lim: 90 exec/s: 70 rss: 69Mb L: 37/73 MS: 1 CrossOver- 00:08:45.121 [2024-11-19 17:52:37.879164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.121 [2024-11-19 17:52:37.879190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.879312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.121 [2024-11-19 17:52:37.879337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 #71 NEW cov: 11895 ft: 15461 corp: 39/1681b lim: 90 exec/s: 71 
rss: 69Mb L: 39/73 MS: 1 PersAutoDict- DE: "\001\003"- 00:08:45.121 [2024-11-19 17:52:37.919537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.121 [2024-11-19 17:52:37.919568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.919688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.121 [2024-11-19 17:52:37.919710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.919839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.121 [2024-11-19 17:52:37.919863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.121 #77 NEW cov: 11895 ft: 15470 corp: 40/1735b lim: 90 exec/s: 77 rss: 69Mb L: 54/73 MS: 1 InsertRepeatedBytes- 00:08:45.121 [2024-11-19 17:52:37.959847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.121 [2024-11-19 17:52:37.959878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.959948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.121 [2024-11-19 17:52:37.959968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.960077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.121 [2024-11-19 17:52:37.960098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.121 [2024-11-19 17:52:37.960213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:45.121 [2024-11-19 17:52:37.960234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.121 #78 NEW cov: 11895 ft: 15493 corp: 41/1824b lim: 90 exec/s: 39 rss: 69Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:08:45.121 #78 DONE cov: 11895 ft: 15493 corp: 41/1824b lim: 90 exec/s: 39 rss: 69Mb 00:08:45.121 ###### Recommended dictionary. ###### 00:08:45.121 "\377\377\377\013" # Uses: 4 00:08:45.121 "\001\003" # Uses: 2 00:08:45.121 ###### End of recommended dictionary. 
###### 00:08:45.121 Done 78 runs in 2 second(s) 00:08:45.381 17:52:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:45.381 17:52:38 -- ../common.sh@72 -- # (( i++ )) 00:08:45.381 17:52:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.381 17:52:38 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:45.381 17:52:38 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:45.381 17:52:38 -- nvmf/run.sh@24 -- # local timen=1 00:08:45.381 17:52:38 -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.381 17:52:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.381 17:52:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:45.381 17:52:38 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:45.381 17:52:38 -- nvmf/run.sh@29 -- # port=4421 00:08:45.381 17:52:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.381 17:52:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:45.381 17:52:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.381 17:52:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:45.381 [2024-11-19 17:52:38.126412] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:45.381 [2024-11-19 17:52:38.126479] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid643500 ] 00:08:45.381 EAL: No free 2048 kB hugepages reported on node 1 00:08:45.640 [2024-11-19 17:52:38.300398] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.640 [2024-11-19 17:52:38.319796] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:45.640 [2024-11-19 17:52:38.319913] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.640 [2024-11-19 17:52:38.371129] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.640 [2024-11-19 17:52:38.387417] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:45.640 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.640 INFO: Seed: 3273683696 00:08:45.640 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:45.640 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:45.640 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.640 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.640 #2 INITED exec/s: 0 rss: 59Mb 00:08:45.640 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
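The ../common.sh@72 and @73 trace lines interleaved above ((( i++ )), (( i < fuzz_num )), start_llvm_fuzz 21 1 0x1) imply a driver loop of roughly this shape. A minimal sketch, assuming nvmf/run.sh has been sourced so start_llvm_fuzz is defined, and with fuzz_num's value invented for illustration since the log never prints it:

    fuzz_num=25                        # assumption: the real target count is not shown here
    for (( i = 0; i < fuzz_num; i++ )); do
        # args mirror run.sh's traced locals: fuzzer_type, timen (seconds), core mask
        start_llvm_fuzz "$i" 1 0x1
    done

Each iteration then expands to the per-target setup seen in this log: printf %02d for the port, the trsvcid sed over fuzz_json.conf, and the llvm_nvme_fuzz invocation with its -Z/-r/-D flags.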
00:08:45.640 This may also happen if the target rejected all inputs we tried so far 00:08:45.640 [2024-11-19 17:52:38.432099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.640 [2024-11-19 17:52:38.432133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.640 [2024-11-19 17:52:38.432167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.640 [2024-11-19 17:52:38.432183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.640 [2024-11-19 17:52:38.432211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.640 [2024-11-19 17:52:38.432226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.900 NEW_FUNC[1/672]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:45.900 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.900 #17 NEW cov: 11643 ft: 11644 corp: 2/34b lim: 50 exec/s: 0 rss: 67Mb L: 33/33 MS: 5 InsertByte-ChangeBit-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:45.900 [2024-11-19 17:52:38.742770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.900 [2024-11-19 17:52:38.742806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.900 [2024-11-19 17:52:38.742856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.900 [2024-11-19 17:52:38.742874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.159 #21 NEW cov: 11756 ft: 12427 corp: 3/54b lim: 50 exec/s: 0 rss: 67Mb L: 20/33 MS: 4 ShuffleBytes-CrossOver-ChangeBit-CrossOver- 00:08:46.159 [2024-11-19 17:52:38.802899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.159 [2024-11-19 17:52:38.802931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.159 [2024-11-19 17:52:38.802966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.159 [2024-11-19 17:52:38.802985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.159 [2024-11-19 17:52:38.803017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.159 [2024-11-19 17:52:38.803035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.159 #22 NEW cov: 11762 ft: 12769 corp: 4/88b lim: 50 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertByte- 00:08:46.159 [2024-11-19 17:52:38.863059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:46.159 [2024-11-19 17:52:38.863090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.159 [2024-11-19 17:52:38.863139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.159 [2024-11-19 17:52:38.863156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.159 [2024-11-19 17:52:38.863185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.159 [2024-11-19 17:52:38.863202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.159 #25 NEW cov: 11847 ft: 13019 corp: 5/124b lim: 50 exec/s: 0 rss: 67Mb L: 36/36 MS: 3 InsertByte-ChangeBit-CrossOver- 00:08:46.159 [2024-11-19 17:52:38.913215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.159 [2024-11-19 17:52:38.913244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.160 [2024-11-19 17:52:38.913292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.160 [2024-11-19 17:52:38.913309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.160 [2024-11-19 17:52:38.913338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.160 [2024-11-19 17:52:38.913354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.160 [2024-11-19 17:52:38.913382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.160 [2024-11-19 17:52:38.913397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.160 #26 NEW cov: 11847 ft: 13410 corp: 6/167b lim: 50 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 CrossOver- 00:08:46.160 [2024-11-19 17:52:38.963171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.160 [2024-11-19 17:52:38.963200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.160 #27 NEW cov: 11847 ft: 14277 corp: 7/184b lim: 50 exec/s: 0 rss: 67Mb L: 17/43 MS: 1 EraseBytes- 00:08:46.419 [2024-11-19 17:52:39.033555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.419 [2024-11-19 17:52:39.033586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.033642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.419 [2024-11-19 17:52:39.033664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.033693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 
nsid:0 00:08:46.419 [2024-11-19 17:52:39.033709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.033737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.419 [2024-11-19 17:52:39.033753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.419 #28 NEW cov: 11847 ft: 14364 corp: 8/227b lim: 50 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:46.419 [2024-11-19 17:52:39.103576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.419 [2024-11-19 17:52:39.103612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.419 #29 NEW cov: 11847 ft: 14431 corp: 9/244b lim: 50 exec/s: 0 rss: 67Mb L: 17/43 MS: 1 ShuffleBytes- 00:08:46.419 [2024-11-19 17:52:39.173759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.419 [2024-11-19 17:52:39.173789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.419 #35 NEW cov: 11847 ft: 14474 corp: 10/261b lim: 50 exec/s: 0 rss: 67Mb L: 17/43 MS: 1 ChangeBit- 00:08:46.419 [2024-11-19 17:52:39.224079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.419 [2024-11-19 17:52:39.224108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.224154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.419 [2024-11-19 17:52:39.224172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.224201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.419 [2024-11-19 17:52:39.224217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.224245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.419 [2024-11-19 17:52:39.224260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.419 [2024-11-19 17:52:39.224288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.419 [2024-11-19 17:52:39.224303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.419 #36 NEW cov: 11847 ft: 14634 corp: 11/311b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:46.679 [2024-11-19 17:52:39.294100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.679 [2024-11-19 17:52:39.294130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:46.679 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.679 #38 NEW cov: 11870 ft: 14680 corp: 12/328b lim: 50 exec/s: 0 rss: 68Mb L: 17/50 MS: 2 CrossOver-CrossOver- 00:08:46.679 [2024-11-19 17:52:39.354188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.679 [2024-11-19 17:52:39.354217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.679 #39 NEW cov: 11870 ft: 14804 corp: 13/345b lim: 50 exec/s: 0 rss: 68Mb L: 17/50 MS: 1 CopyPart- 00:08:46.679 [2024-11-19 17:52:39.424405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.679 [2024-11-19 17:52:39.424433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.679 #40 NEW cov: 11870 ft: 14839 corp: 14/362b lim: 50 exec/s: 40 rss: 68Mb L: 17/50 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:46.679 [2024-11-19 17:52:39.494815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.679 [2024-11-19 17:52:39.494844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.679 [2024-11-19 17:52:39.494890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.679 [2024-11-19 17:52:39.494908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.679 [2024-11-19 17:52:39.494937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.679 [2024-11-19 17:52:39.494953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.679 [2024-11-19 17:52:39.494981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.679 [2024-11-19 17:52:39.494996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.679 [2024-11-19 17:52:39.495024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.679 [2024-11-19 17:52:39.495039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.938 #41 NEW cov: 11870 ft: 14856 corp: 15/412b lim: 50 exec/s: 41 rss: 68Mb L: 50/50 MS: 1 CrossOver- 00:08:46.938 [2024-11-19 17:52:39.554756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.938 [2024-11-19 17:52:39.554785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.938 #42 NEW cov: 11870 ft: 14884 corp: 16/428b lim: 50 exec/s: 42 rss: 68Mb L: 16/50 MS: 1 CrossOver- 00:08:46.938 [2024-11-19 17:52:39.604915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.938 [2024-11-19 17:52:39.604944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.938 [2024-11-19 17:52:39.604992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.938 [2024-11-19 17:52:39.605009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.938 #43 NEW cov: 11870 ft: 14927 corp: 17/450b lim: 50 exec/s: 43 rss: 68Mb L: 22/50 MS: 1 CMP- DE: "\377\027"- 00:08:46.938 [2024-11-19 17:52:39.655105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.938 [2024-11-19 17:52:39.655134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.938 [2024-11-19 17:52:39.655182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.938 [2024-11-19 17:52:39.655199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.938 [2024-11-19 17:52:39.655228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.939 [2024-11-19 17:52:39.655244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.939 #44 NEW cov: 11870 ft: 14970 corp: 18/483b lim: 50 exec/s: 44 rss: 68Mb L: 33/50 MS: 1 ChangeASCIIInt- 00:08:46.939 [2024-11-19 17:52:39.705117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.939 [2024-11-19 17:52:39.705146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.939 #50 NEW cov: 11870 ft: 14976 corp: 19/502b lim: 50 exec/s: 50 rss: 68Mb L: 19/50 MS: 1 PersAutoDict- DE: "\377\027"- 00:08:46.939 [2024-11-19 17:52:39.755978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.939 [2024-11-19 17:52:39.756006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.939 [2024-11-19 17:52:39.756049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.939 [2024-11-19 17:52:39.756065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.939 #51 NEW cov: 11870 ft: 15084 corp: 20/526b lim: 50 exec/s: 51 rss: 68Mb L: 24/50 MS: 1 CopyPart- 00:08:46.939 [2024-11-19 17:52:39.796190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.939 [2024-11-19 17:52:39.796215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.939 [2024-11-19 17:52:39.796253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.939 [2024-11-19 17:52:39.796269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.939 [2024-11-19 17:52:39.796321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.939 [2024-11-19 17:52:39.796336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.198 #52 NEW cov: 11870 ft: 15135 corp: 21/558b lim: 50 exec/s: 52 rss: 68Mb L: 32/50 MS: 1 InsertRepeatedBytes- 00:08:47.198 [2024-11-19 17:52:39.836576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.198 [2024-11-19 17:52:39.836607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.836658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.198 [2024-11-19 17:52:39.836674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.836724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.198 [2024-11-19 17:52:39.836756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.836805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.198 [2024-11-19 17:52:39.836819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.836870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:47.198 [2024-11-19 17:52:39.836886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:47.198 #53 NEW cov: 11870 ft: 15219 corp: 22/608b lim: 50 exec/s: 53 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:47.198 [2024-11-19 17:52:39.876199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.198 [2024-11-19 17:52:39.876225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.198 #54 NEW cov: 11870 ft: 15274 corp: 23/625b lim: 50 exec/s: 54 rss: 68Mb L: 17/50 MS: 1 ChangeByte- 00:08:47.198 [2024-11-19 17:52:39.916422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.198 [2024-11-19 17:52:39.916449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.916498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.198 [2024-11-19 17:52:39.916513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.198 #57 NEW cov: 11870 ft: 15333 corp: 24/647b lim: 50 exec/s: 57 rss: 68Mb L: 22/50 MS: 3 CopyPart-CMP-InsertRepeatedBytes- DE: "\377\377~|\200\016K\011"- 00:08:47.198 [2024-11-19 17:52:39.956831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.198 [2024-11-19 17:52:39.956858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.956920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.198 [2024-11-19 17:52:39.956936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.956987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.198 [2024-11-19 17:52:39.957002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.198 [2024-11-19 17:52:39.957057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.198 [2024-11-19 17:52:39.957072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.198 #58 NEW cov: 11870 ft: 15343 corp: 25/690b lim: 50 exec/s: 58 rss: 68Mb L: 43/50 MS: 1 ChangeBit- 00:08:47.198 [2024-11-19 17:52:39.996558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.198 [2024-11-19 17:52:39.996585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.198 #59 NEW cov: 11870 ft: 15347 corp: 26/707b lim: 50 exec/s: 59 rss: 68Mb L: 17/50 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:47.198 [2024-11-19 17:52:40.046746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.198 [2024-11-19 17:52:40.046773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 #60 NEW cov: 11870 ft: 15357 corp: 27/725b lim: 50 exec/s: 60 rss: 68Mb L: 18/50 MS: 1 InsertByte- 00:08:47.457 [2024-11-19 17:52:40.087396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.457 [2024-11-19 17:52:40.087424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.087472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.457 [2024-11-19 17:52:40.087488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.087539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.457 [2024-11-19 17:52:40.087554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.087606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.457 [2024-11-19 17:52:40.087622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.087680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:47.457 [2024-11-19 17:52:40.087695] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:47.457 #61 NEW cov: 11870 ft: 15377 corp: 28/775b lim: 50 exec/s: 61 rss: 68Mb L: 50/50 MS: 1 ChangeBit- 00:08:47.457 [2024-11-19 17:52:40.127171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.457 [2024-11-19 17:52:40.127198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.127238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.457 [2024-11-19 17:52:40.127253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.127306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.457 [2024-11-19 17:52:40.127322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.457 #62 NEW cov: 11870 ft: 15412 corp: 29/814b lim: 50 exec/s: 62 rss: 68Mb L: 39/50 MS: 1 InsertRepeatedBytes- 00:08:47.457 [2024-11-19 17:52:40.167313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.457 [2024-11-19 17:52:40.167340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.167380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.457 [2024-11-19 17:52:40.167396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.167450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.457 [2024-11-19 17:52:40.167465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.457 #63 NEW cov: 11870 ft: 15461 corp: 30/847b lim: 50 exec/s: 63 rss: 68Mb L: 33/50 MS: 1 InsertRepeatedBytes- 00:08:47.457 [2024-11-19 17:52:40.207477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.457 [2024-11-19 17:52:40.207505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.207539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.457 [2024-11-19 17:52:40.207555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.207611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.457 [2024-11-19 17:52:40.207627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.457 #64 NEW cov: 11870 ft: 15481 corp: 31/880b lim: 50 exec/s: 64 rss: 68Mb L: 33/50 MS: 1 ChangeBit- 00:08:47.457 [2024-11-19 17:52:40.247730] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.457 [2024-11-19 17:52:40.247757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.247804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.457 [2024-11-19 17:52:40.247819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.247873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.457 [2024-11-19 17:52:40.247891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.247944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.457 [2024-11-19 17:52:40.247958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.457 #65 NEW cov: 11870 ft: 15484 corp: 32/928b lim: 50 exec/s: 65 rss: 69Mb L: 48/50 MS: 1 InsertRepeatedBytes- 00:08:47.457 [2024-11-19 17:52:40.287516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.457 [2024-11-19 17:52:40.287542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.457 [2024-11-19 17:52:40.287592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.457 [2024-11-19 17:52:40.287611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.457 #66 NEW cov: 11870 ft: 15499 corp: 33/948b lim: 50 exec/s: 66 rss: 69Mb L: 20/50 MS: 1 ShuffleBytes- 00:08:47.716 [2024-11-19 17:52:40.328021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.716 [2024-11-19 17:52:40.328049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.716 [2024-11-19 17:52:40.328096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.716 [2024-11-19 17:52:40.328111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.716 [2024-11-19 17:52:40.328164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.716 [2024-11-19 17:52:40.328179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.716 [2024-11-19 17:52:40.328230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.716 [2024-11-19 17:52:40.328245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.716 [2024-11-19 17:52:40.328298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:47.716 
[2024-11-19 17:52:40.328312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:47.716 #67 NEW cov: 11870 ft: 15519 corp: 34/998b lim: 50 exec/s: 67 rss: 69Mb L: 50/50 MS: 1 ChangeBit- 00:08:47.716 [2024-11-19 17:52:40.367565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.716 [2024-11-19 17:52:40.367592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.716 #68 NEW cov: 11870 ft: 15536 corp: 35/1015b lim: 50 exec/s: 68 rss: 69Mb L: 17/50 MS: 1 PersAutoDict- DE: "\377\377~|\200\016K\011"- 00:08:47.716 [2024-11-19 17:52:40.407823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.716 [2024-11-19 17:52:40.407850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.716 [2024-11-19 17:52:40.407920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.716 [2024-11-19 17:52:40.407936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.716 #69 NEW cov: 11870 ft: 15552 corp: 36/1044b lim: 50 exec/s: 34 rss: 69Mb L: 29/50 MS: 1 EraseBytes- 00:08:47.716 #69 DONE cov: 11870 ft: 15552 corp: 36/1044b lim: 50 exec/s: 34 rss: 69Mb 00:08:47.716 ###### Recommended dictionary. ###### 00:08:47.716 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:47.716 "\377\027" # Uses: 1 00:08:47.716 "\377\377~|\200\016K\011" # Uses: 1 00:08:47.716 ###### End of recommended dictionary. 
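The shell trace that follows repeats, for fuzzer 22, the same per-round setup traced for fuzzer 21 above: common.sh advances the loop counter, run.sh removes the previous round's config, derives a per-fuzzer TCP port by appending the zero-padded fuzzer number to 44 (21 gives 4421, 22 gives 4422), rewrites trsvcid in fuzz_json.conf with sed, creates the corpus directory, and launches llvm_nvme_fuzz for one second. A minimal bash sketch of that logic, reconstructed from the traced commands; names match the trace where visible, and anything else (such as the rootdir variable) is an assumption, not the verbatim nvmf/run.sh source:

# Sketch only: per-round setup as reconstructed from the traces in this log.
start_llvm_fuzz() {
  local fuzzer_type=$1 timen=$2 core=$3                          # e.g. 22 1 0x1
  local corpus_dir=$rootdir/../corpus/llvm_nvmf_${fuzzer_type}   # $rootdir assumed
  local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
  local port=44$(printf %02d $fuzzer_type)                       # 21 -> 4421, 22 -> 4422
  local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  mkdir -p $corpus_dir
  # Point the JSON config at this round's port (redirect to $nvmf_cfg assumed).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
  # One-second fuzz run against the in-process TCP target, flags as traced.
  $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
      -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg \
      -t $timen -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk${fuzzer_type}.sock
}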
###### 00:08:47.716 Done 69 runs in 2 second(s) 00:08:47.716 17:52:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:47.716 17:52:40 -- ../common.sh@72 -- # (( i++ )) 00:08:47.716 17:52:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.716 17:52:40 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:47.716 17:52:40 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:47.716 17:52:40 -- nvmf/run.sh@24 -- # local timen=1 00:08:47.716 17:52:40 -- nvmf/run.sh@25 -- # local core=0x1 00:08:47.716 17:52:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:47.716 17:52:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:47.716 17:52:40 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:47.716 17:52:40 -- nvmf/run.sh@29 -- # port=4422 00:08:47.716 17:52:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:47.716 17:52:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:47.716 17:52:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:47.716 17:52:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:47.976 [2024-11-19 17:52:40.582335] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:47.976 [2024-11-19 17:52:40.582404] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid643897 ] 00:08:47.976 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.976 [2024-11-19 17:52:40.764471] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.976 [2024-11-19 17:52:40.783995] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:47.976 [2024-11-19 17:52:40.784133] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.976 [2024-11-19 17:52:40.835725] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:48.236 [2024-11-19 17:52:40.852052] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:48.236 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.236 INFO: Seed: 1441716729 00:08:48.236 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:48.236 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:48.236 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:48.236 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.236 #2 INITED exec/s: 0 rss: 59Mb 00:08:48.236 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:48.236 This may also happen if the target rejected all inputs we tried so far 00:08:48.236 [2024-11-19 17:52:40.901281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.236 [2024-11-19 17:52:40.901311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.236 [2024-11-19 17:52:40.901348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.236 [2024-11-19 17:52:40.901366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.236 [2024-11-19 17:52:40.901419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.236 [2024-11-19 17:52:40.901433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.236 [2024-11-19 17:52:40.901487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.236 [2024-11-19 17:52:40.901503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.496 NEW_FUNC[1/671]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:48.496 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.496 #24 NEW cov: 11663 ft: 11665 corp: 2/80b lim: 85 exec/s: 0 rss: 67Mb L: 79/79 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:48.496 [2024-11-19 17:52:41.191633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.496 [2024-11-19 17:52:41.191666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.496 NEW_FUNC[1/1]: 0x1c78a18 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1312 00:08:48.496 #34 NEW cov: 11782 ft: 13084 corp: 3/111b lim: 85 exec/s: 0 rss: 67Mb L: 31/79 MS: 5 InsertByte-InsertByte-ChangeBit-EraseBytes-InsertRepeatedBytes- 00:08:48.496 [2024-11-19 17:52:41.232109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.496 [2024-11-19 17:52:41.232138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.496 [2024-11-19 17:52:41.232193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.496 [2024-11-19 17:52:41.232209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.496 [2024-11-19 17:52:41.232262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.496 [2024-11-19 17:52:41.232278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.496 [2024-11-19 17:52:41.232332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:3 nsid:0 00:08:48.496 [2024-11-19 17:52:41.232347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.496 #35 NEW cov: 11788 ft: 13264 corp: 4/190b lim: 85 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 ChangeByte- 00:08:48.496 [2024-11-19 17:52:41.271771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.496 [2024-11-19 17:52:41.271799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.496 #36 NEW cov: 11873 ft: 13571 corp: 5/221b lim: 85 exec/s: 0 rss: 67Mb L: 31/79 MS: 1 ChangeBit- 00:08:48.496 [2024-11-19 17:52:41.321890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.496 [2024-11-19 17:52:41.321917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.496 #37 NEW cov: 11873 ft: 13695 corp: 6/252b lim: 85 exec/s: 0 rss: 67Mb L: 31/79 MS: 1 CopyPart- 00:08:48.756 [2024-11-19 17:52:41.362027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.362055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 #38 NEW cov: 11873 ft: 13746 corp: 7/285b lim: 85 exec/s: 0 rss: 67Mb L: 33/79 MS: 1 InsertRepeatedBytes- 00:08:48.756 [2024-11-19 17:52:41.402129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.402156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 #39 NEW cov: 11873 ft: 13871 corp: 8/316b lim: 85 exec/s: 0 rss: 67Mb L: 31/79 MS: 1 ShuffleBytes- 00:08:48.756 [2024-11-19 17:52:41.442721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.442748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 [2024-11-19 17:52:41.442791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.756 [2024-11-19 17:52:41.442806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.756 [2024-11-19 17:52:41.442859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.756 [2024-11-19 17:52:41.442889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.756 [2024-11-19 17:52:41.442943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.756 [2024-11-19 17:52:41.442959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.756 #40 NEW cov: 11873 ft: 13949 corp: 9/395b lim: 85 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 ChangeBinInt- 00:08:48.756 [2024-11-19 17:52:41.482411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.482438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 #42 NEW cov: 11873 ft: 13964 corp: 10/420b lim: 85 exec/s: 0 rss: 67Mb L: 25/79 MS: 2 CopyPart-CrossOver- 00:08:48.756 [2024-11-19 17:52:41.522533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.522559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 #43 NEW cov: 11873 ft: 14023 corp: 11/452b lim: 85 exec/s: 0 rss: 67Mb L: 32/79 MS: 1 InsertByte- 00:08:48.756 [2024-11-19 17:52:41.562609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.562636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 #44 NEW cov: 11873 ft: 14067 corp: 12/473b lim: 85 exec/s: 0 rss: 67Mb L: 21/79 MS: 1 EraseBytes- 00:08:48.756 [2024-11-19 17:52:41.603192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.756 [2024-11-19 17:52:41.603219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.756 [2024-11-19 17:52:41.603266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.756 [2024-11-19 17:52:41.603281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.756 [2024-11-19 17:52:41.603335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.756 [2024-11-19 17:52:41.603349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.756 [2024-11-19 17:52:41.603403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.756 [2024-11-19 17:52:41.603419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.016 #45 NEW cov: 11873 ft: 14103 corp: 13/552b lim: 85 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 CrossOver- 00:08:49.016 [2024-11-19 17:52:41.643129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-11-19 17:52:41.643155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 [2024-11-19 17:52:41.643197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.016 [2024-11-19 17:52:41.643213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.016 [2024-11-19 17:52:41.643269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.016 [2024-11-19 17:52:41.643284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:49.016 #46 NEW cov: 11873 ft: 14443 corp: 14/613b lim: 85 exec/s: 0 rss: 67Mb L: 61/79 MS: 1 InsertRepeatedBytes- 00:08:49.016 [2024-11-19 17:52:41.682987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-11-19 17:52:41.683014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 #47 NEW cov: 11873 ft: 14452 corp: 15/638b lim: 85 exec/s: 0 rss: 68Mb L: 25/79 MS: 1 ChangeByte- 00:08:49.016 [2024-11-19 17:52:41.723131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-11-19 17:52:41.723158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 #48 NEW cov: 11873 ft: 14491 corp: 16/664b lim: 85 exec/s: 0 rss: 68Mb L: 26/79 MS: 1 InsertByte- 00:08:49.016 [2024-11-19 17:52:41.763663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-11-19 17:52:41.763689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 [2024-11-19 17:52:41.763741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.016 [2024-11-19 17:52:41.763756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.016 [2024-11-19 17:52:41.763807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.016 [2024-11-19 17:52:41.763822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.016 [2024-11-19 17:52:41.763875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.016 [2024-11-19 17:52:41.763891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.016 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:49.016 #49 NEW cov: 11896 ft: 14583 corp: 17/741b lim: 85 exec/s: 0 rss: 68Mb L: 77/79 MS: 1 CopyPart- 00:08:49.016 [2024-11-19 17:52:41.803346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-11-19 17:52:41.803373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 #50 NEW cov: 11896 ft: 14594 corp: 18/772b lim: 85 exec/s: 0 rss: 68Mb L: 31/79 MS: 1 ChangeBinInt- 00:08:49.016 [2024-11-19 17:52:41.843487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-11-19 17:52:41.843515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 #51 NEW cov: 11896 ft: 14612 corp: 19/790b lim: 85 exec/s: 0 rss: 68Mb L: 18/79 MS: 1 EraseBytes- 00:08:49.276 [2024-11-19 17:52:41.883592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:08:49.276 [2024-11-19 17:52:41.883626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.276 #52 NEW cov: 11896 ft: 14626 corp: 20/822b lim: 85 exec/s: 52 rss: 68Mb L: 32/79 MS: 1 ShuffleBytes- 00:08:49.276 [2024-11-19 17:52:41.924175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.276 [2024-11-19 17:52:41.924203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:41.924247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.276 [2024-11-19 17:52:41.924262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:41.924315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.276 [2024-11-19 17:52:41.924330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:41.924385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.276 [2024-11-19 17:52:41.924400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.276 #53 NEW cov: 11896 ft: 14640 corp: 21/899b lim: 85 exec/s: 53 rss: 68Mb L: 77/79 MS: 1 CrossOver- 00:08:49.276 [2024-11-19 17:52:41.963799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.276 [2024-11-19 17:52:41.963827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.276 #54 NEW cov: 11896 ft: 14652 corp: 22/920b lim: 85 exec/s: 54 rss: 68Mb L: 21/79 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:49.276 [2024-11-19 17:52:42.004306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.276 [2024-11-19 17:52:42.004333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.004369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.276 [2024-11-19 17:52:42.004385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.004439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.276 [2024-11-19 17:52:42.004453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.276 #55 NEW cov: 11896 ft: 14663 corp: 23/971b lim: 85 exec/s: 55 rss: 68Mb L: 51/79 MS: 1 InsertRepeatedBytes- 00:08:49.276 [2024-11-19 17:52:42.044054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.276 [2024-11-19 17:52:42.044081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
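Each fuzz input that decodes as a well-formed command produces a pair of *NOTICE* lines like the ones above: nvme_io_qpair_print_command echoes the command the fuzzer submitted on the I/O queue, and spdk_nvme_print_completion echoes the target's completion. A rough field-by-field reading, using standard NVMe terminology (the annotation is ours, not part of the log):

# Reading one *NOTICE* command/completion pair from this run:
#   RESERVATION REGISTER (0d)            NVM I/O command, opcode 0x0d
#   sqid:1 cid:0 nsid:0                  submission queue 1, command id 0, namespace id 0
#   INVALID NAMESPACE OR FORMAT (00/0b)  status code type 0x0 (generic) / status code 0x0b
#   cdw0:0 sqhd:0002                     completion dword 0; new submission-queue head pointer
#   p:0 m:0 dnr:1                        phase tag, More bit, Do Not Retry set
# The inputs here carry nsid:0, which is not a valid namespace id for these
# commands, so the target rejects each one with this status.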
00:08:49.276 #56 NEW cov: 11896 ft: 14709 corp: 24/992b lim: 85 exec/s: 56 rss: 68Mb L: 21/79 MS: 1 ChangeByte- 00:08:49.276 [2024-11-19 17:52:42.084487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.276 [2024-11-19 17:52:42.084515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.084569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.276 [2024-11-19 17:52:42.084584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.084643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.276 [2024-11-19 17:52:42.084658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.276 #57 NEW cov: 11896 ft: 14730 corp: 25/1052b lim: 85 exec/s: 57 rss: 68Mb L: 60/79 MS: 1 InsertRepeatedBytes- 00:08:49.276 [2024-11-19 17:52:42.124727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.276 [2024-11-19 17:52:42.124754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.124815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.276 [2024-11-19 17:52:42.124831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.124886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.276 [2024-11-19 17:52:42.124901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.276 [2024-11-19 17:52:42.124958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.276 [2024-11-19 17:52:42.124974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.536 #58 NEW cov: 11896 ft: 14791 corp: 26/1132b lim: 85 exec/s: 58 rss: 68Mb L: 80/80 MS: 1 InsertByte- 00:08:49.536 [2024-11-19 17:52:42.164586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.536 [2024-11-19 17:52:42.164618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.164673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.536 [2024-11-19 17:52:42.164688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.536 #59 NEW cov: 11896 ft: 15082 corp: 27/1167b lim: 85 exec/s: 59 rss: 68Mb L: 35/80 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:49.536 [2024-11-19 17:52:42.204972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.536 [2024-11-19 
17:52:42.205000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.205057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.536 [2024-11-19 17:52:42.205073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.205127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.536 [2024-11-19 17:52:42.205142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.205198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.536 [2024-11-19 17:52:42.205213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.536 #60 NEW cov: 11896 ft: 15136 corp: 28/1246b lim: 85 exec/s: 60 rss: 68Mb L: 79/80 MS: 1 CopyPart- 00:08:49.536 [2024-11-19 17:52:42.245106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.536 [2024-11-19 17:52:42.245134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.245181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.536 [2024-11-19 17:52:42.245197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.245250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.536 [2024-11-19 17:52:42.245269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.245323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.536 [2024-11-19 17:52:42.245338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.536 #61 NEW cov: 11896 ft: 15142 corp: 29/1325b lim: 85 exec/s: 61 rss: 68Mb L: 79/80 MS: 1 ChangeBit- 00:08:49.536 [2024-11-19 17:52:42.284755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.536 [2024-11-19 17:52:42.284783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.536 #62 NEW cov: 11896 ft: 15166 corp: 30/1350b lim: 85 exec/s: 62 rss: 68Mb L: 25/80 MS: 1 ChangeBit- 00:08:49.536 [2024-11-19 17:52:42.325017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.536 [2024-11-19 17:52:42.325045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.536 [2024-11-19 17:52:42.325098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.536 [2024-11-19 
17:52:42.325115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.536 #63 NEW cov: 11896 ft: 15178 corp: 31/1398b lim: 85 exec/s: 63 rss: 68Mb L: 48/80 MS: 1 InsertRepeatedBytes- 00:08:49.536 [2024-11-19 17:52:42.365019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.536 [2024-11-19 17:52:42.365046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.536 #64 NEW cov: 11896 ft: 15185 corp: 32/1428b lim: 85 exec/s: 64 rss: 69Mb L: 30/80 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:49.796 [2024-11-19 17:52:42.405574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.405606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.405668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.796 [2024-11-19 17:52:42.405685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.405741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.796 [2024-11-19 17:52:42.405756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.405810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.796 [2024-11-19 17:52:42.405826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.796 #65 NEW cov: 11896 ft: 15186 corp: 33/1507b lim: 85 exec/s: 65 rss: 69Mb L: 79/80 MS: 1 ShuffleBytes- 00:08:49.796 [2024-11-19 17:52:42.445205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.445232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 #66 NEW cov: 11896 ft: 15217 corp: 34/1536b lim: 85 exec/s: 66 rss: 69Mb L: 29/80 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:08:49.796 [2024-11-19 17:52:42.485277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.485305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 #67 NEW cov: 11896 ft: 15286 corp: 35/1557b lim: 85 exec/s: 67 rss: 69Mb L: 21/80 MS: 1 ChangeByte- 00:08:49.796 [2024-11-19 17:52:42.525451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.525478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 #68 NEW cov: 11896 ft: 15295 corp: 36/1582b lim: 85 exec/s: 68 rss: 69Mb L: 25/80 MS: 1 ChangeByte- 00:08:49.796 [2024-11-19 17:52:42.566002] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.566030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.566078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.796 [2024-11-19 17:52:42.566094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.566148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.796 [2024-11-19 17:52:42.566181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.566235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.796 [2024-11-19 17:52:42.566250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.796 #69 NEW cov: 11896 ft: 15313 corp: 37/1659b lim: 85 exec/s: 69 rss: 69Mb L: 77/80 MS: 1 ShuffleBytes- 00:08:49.796 [2024-11-19 17:52:42.605724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.605752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 #70 NEW cov: 11896 ft: 15329 corp: 38/1692b lim: 85 exec/s: 70 rss: 69Mb L: 33/80 MS: 1 CMP- DE: "\372\233\016\010\306\177\000\000"- 00:08:49.796 [2024-11-19 17:52:42.646083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.796 [2024-11-19 17:52:42.646109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.646147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.796 [2024-11-19 17:52:42.646163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.796 [2024-11-19 17:52:42.646217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.796 [2024-11-19 17:52:42.646233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.056 #71 NEW cov: 11896 ft: 15334 corp: 39/1752b lim: 85 exec/s: 71 rss: 69Mb L: 60/80 MS: 1 ChangeBinInt- 00:08:50.056 [2024-11-19 17:52:42.685930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.056 [2024-11-19 17:52:42.685957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.056 #72 NEW cov: 11896 ft: 15343 corp: 40/1783b lim: 85 exec/s: 72 rss: 69Mb L: 31/80 MS: 1 CopyPart- 00:08:50.056 [2024-11-19 17:52:42.725991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.056 [2024-11-19 17:52:42.726018] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.056 #73 NEW cov: 11896 ft: 15348 corp: 41/1812b lim: 85 exec/s: 73 rss: 69Mb L: 29/80 MS: 1 ChangeBit- 00:08:50.056 [2024-11-19 17:52:42.766646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.056 [2024-11-19 17:52:42.766676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.056 [2024-11-19 17:52:42.766715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.056 [2024-11-19 17:52:42.766730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.056 [2024-11-19 17:52:42.766786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.056 [2024-11-19 17:52:42.766801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.056 [2024-11-19 17:52:42.766857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.056 [2024-11-19 17:52:42.766872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.056 #74 NEW cov: 11896 ft: 15365 corp: 42/1894b lim: 85 exec/s: 74 rss: 69Mb L: 82/82 MS: 1 CrossOver- 00:08:50.056 [2024-11-19 17:52:42.806711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.056 [2024-11-19 17:52:42.806738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.056 [2024-11-19 17:52:42.806786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.056 [2024-11-19 17:52:42.806802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.056 [2024-11-19 17:52:42.806855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.057 [2024-11-19 17:52:42.806870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.057 [2024-11-19 17:52:42.806927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.057 [2024-11-19 17:52:42.806941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.057 #75 NEW cov: 11896 ft: 15516 corp: 43/1977b lim: 85 exec/s: 75 rss: 69Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:50.057 [2024-11-19 17:52:42.846395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.057 [2024-11-19 17:52:42.846422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.057 [2024-11-19 17:52:42.876476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.057 [2024-11-19 17:52:42.876502] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:50.057 #77 NEW cov: 11896 ft: 15548 corp: 44/2002b lim: 85 exec/s: 38 rss: 70Mb L: 25/83 MS: 2 CrossOver-ChangeBit-
00:08:50.057 #77 DONE cov: 11896 ft: 15548 corp: 44/2002b lim: 85 exec/s: 38 rss: 70Mb
00:08:50.057 ###### Recommended dictionary. ######
00:08:50.057 "\000\000\000\000" # Uses: 2
00:08:50.057 "\001\000\000\000\000\000\000\001" # Uses: 0
00:08:50.057 "\372\233\016\010\306\177\000\000" # Uses: 0
00:08:50.057 ###### End of recommended dictionary. ######
00:08:50.057 Done 77 runs in 2 second(s)
00:08:50.316 17:52:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf
00:08:50.316 17:52:43 -- ../common.sh@72 -- # (( i++ ))
00:08:50.316 17:52:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:50.316 17:52:43 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
00:08:50.316 17:52:43 -- nvmf/run.sh@23 -- # local fuzzer_type=23
00:08:50.316 17:52:43 -- nvmf/run.sh@24 -- # local timen=1
00:08:50.316 17:52:43 -- nvmf/run.sh@25 -- # local core=0x1
00:08:50.316 17:52:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:50.316 17:52:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
00:08:50.316 17:52:43 -- nvmf/run.sh@29 -- # printf %02d 23
00:08:50.316 17:52:43 -- nvmf/run.sh@29 -- # port=4423
00:08:50.316 17:52:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:50.316 17:52:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
00:08:50.316 17:52:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:50.316 17:52:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock
00:08:50.316 [2024-11-19 17:52:43.030469] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:50.316 [2024-11-19 17:52:43.030524] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644333 ]
00:08:50.316 EAL: No free 2048 kB hugepages reported on node 1
00:08:50.575 [2024-11-19 17:52:43.201737] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:50.575 [2024-11-19 17:52:43.221004] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:50.575 [2024-11-19 17:52:43.221128] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:50.575 [2024-11-19 17:52:43.272393] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:50.575 [2024-11-19 17:52:43.288733] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
00:08:50.575 INFO: Running with entropic power schedule (0xFF, 100).
00:08:50.575 INFO: Seed: 3879730851
00:08:50.575 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:50.575 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:50.575 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:50.575 INFO: A corpus is not provided, starting from an empty corpus
00:08:50.575 #2 INITED exec/s: 0 rss: 59Mb
00:08:50.575 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:50.575 This may also happen if the target rejected all inputs we tried so far
00:08:50.575 [2024-11-19 17:52:43.333926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:50.575 [2024-11-19 17:52:43.333956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:50.575 [2024-11-19 17:52:43.333992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:50.575 [2024-11-19 17:52:43.334006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:50.575 [2024-11-19 17:52:43.334056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:50.575 [2024-11-19 17:52:43.334071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:50.835 NEW_FUNC[1/671]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671
00:08:50.835 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:50.835 #3 NEW cov: 11600 ft: 11601 corp: 2/18b lim: 25 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 InsertRepeatedBytes-
00:08:50.835 [2024-11-19 17:52:43.654830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:50.835 [2024-11-19 17:52:43.654865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:50.835 [2024-11-19 17:52:43.654935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:50.835 [2024-11-19 17:52:43.654951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:50.835 [2024-11-19 17:52:43.655001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:50.835 [2024-11-19 17:52:43.655016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:50.835 [2024-11-19 17:52:43.655067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:50.835 [2024-11-19 17:52:43.655081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:50.835 #8 NEW cov: 11715 ft: 12418 corp: 3/41b lim: 25 exec/s: 0 rss: 67Mb L: 23/23 MS: 5 CrossOver-ShuffleBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes-
00:08:50.835 [2024-11-19 17:52:43.694771] nvme_qpair.c:
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.835 [2024-11-19 17:52:43.694800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.835 [2024-11-19 17:52:43.694836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.836 [2024-11-19 17:52:43.694851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.836 [2024-11-19 17:52:43.694903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.836 [2024-11-19 17:52:43.694917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.095 #9 NEW cov: 11721 ft: 12739 corp: 4/58b lim: 25 exec/s: 0 rss: 67Mb L: 17/23 MS: 1 ChangeBinInt- 00:08:51.095 [2024-11-19 17:52:43.734880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.095 [2024-11-19 17:52:43.734907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.095 [2024-11-19 17:52:43.734956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.095 [2024-11-19 17:52:43.734972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.095 [2024-11-19 17:52:43.735027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.095 [2024-11-19 17:52:43.735042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.095 #10 NEW cov: 11806 ft: 13024 corp: 5/75b lim: 25 exec/s: 0 rss: 67Mb L: 17/23 MS: 1 CopyPart- 00:08:51.095 [2024-11-19 17:52:43.775227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.095 [2024-11-19 17:52:43.775254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.095 [2024-11-19 17:52:43.775323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.095 [2024-11-19 17:52:43.775338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.095 [2024-11-19 17:52:43.775390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.095 [2024-11-19 17:52:43.775404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.095 [2024-11-19 17:52:43.775456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.095 [2024-11-19 17:52:43.775470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.775526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.096 [2024-11-19 17:52:43.775541] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.096 #11 NEW cov: 11806 ft: 13182 corp: 6/100b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:51.096 [2024-11-19 17:52:43.815208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.096 [2024-11-19 17:52:43.815236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.815298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.096 [2024-11-19 17:52:43.815313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.815367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.096 [2024-11-19 17:52:43.815381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.815434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.096 [2024-11-19 17:52:43.815449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.096 #12 NEW cov: 11806 ft: 13270 corp: 7/121b lim: 25 exec/s: 0 rss: 67Mb L: 21/25 MS: 1 InsertRepeatedBytes- 00:08:51.096 [2024-11-19 17:52:43.855230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.096 [2024-11-19 17:52:43.855258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.855296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.096 [2024-11-19 17:52:43.855310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.855361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.096 [2024-11-19 17:52:43.855375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.096 #13 NEW cov: 11806 ft: 13320 corp: 8/138b lim: 25 exec/s: 0 rss: 67Mb L: 17/25 MS: 1 ChangeByte- 00:08:51.096 [2024-11-19 17:52:43.895356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.096 [2024-11-19 17:52:43.895382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.895420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.096 [2024-11-19 17:52:43.895435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.895486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.096 [2024-11-19 17:52:43.895501] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.096 #14 NEW cov: 11806 ft: 13372 corp: 9/155b lim: 25 exec/s: 0 rss: 67Mb L: 17/25 MS: 1 ChangeBit- 00:08:51.096 [2024-11-19 17:52:43.935396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.096 [2024-11-19 17:52:43.935423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.935464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.096 [2024-11-19 17:52:43.935481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.096 [2024-11-19 17:52:43.935531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.096 [2024-11-19 17:52:43.935545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.096 #15 NEW cov: 11806 ft: 13402 corp: 10/171b lim: 25 exec/s: 0 rss: 67Mb L: 16/25 MS: 1 CrossOver- 00:08:51.355 [2024-11-19 17:52:43.965732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.355 [2024-11-19 17:52:43.965759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.355 [2024-11-19 17:52:43.965804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.355 [2024-11-19 17:52:43.965816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.355 [2024-11-19 17:52:43.965867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.355 [2024-11-19 17:52:43.965882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.355 [2024-11-19 17:52:43.965933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.355 [2024-11-19 17:52:43.965947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:43.965998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.356 [2024-11-19 17:52:43.966012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.356 #16 NEW cov: 11806 ft: 13463 corp: 11/196b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:51.356 [2024-11-19 17:52:44.005750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.356 [2024-11-19 17:52:44.005776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.005821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.356 [2024-11-19 17:52:44.005836] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.005887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.356 [2024-11-19 17:52:44.005901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.005951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.356 [2024-11-19 17:52:44.005964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.356 #17 NEW cov: 11806 ft: 13565 corp: 12/217b lim: 25 exec/s: 0 rss: 67Mb L: 21/25 MS: 1 InsertRepeatedBytes- 00:08:51.356 [2024-11-19 17:52:44.045826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.356 [2024-11-19 17:52:44.045854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.045892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.356 [2024-11-19 17:52:44.045906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.045957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.356 [2024-11-19 17:52:44.045975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.046024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.356 [2024-11-19 17:52:44.046039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.356 #18 NEW cov: 11806 ft: 13595 corp: 13/240b lim: 25 exec/s: 0 rss: 67Mb L: 23/25 MS: 1 CMP- DE: "\377\377\377~"- 00:08:51.356 [2024-11-19 17:52:44.085847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.356 [2024-11-19 17:52:44.085873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.085918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.356 [2024-11-19 17:52:44.085934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.085987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.356 [2024-11-19 17:52:44.086001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.356 #19 NEW cov: 11806 ft: 13687 corp: 14/257b lim: 25 exec/s: 0 rss: 67Mb L: 17/25 MS: 1 ChangeBit- 00:08:51.356 [2024-11-19 17:52:44.126167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.356 [2024-11-19 17:52:44.126194] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.126247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.356 [2024-11-19 17:52:44.126261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.126312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.356 [2024-11-19 17:52:44.126327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.126380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.356 [2024-11-19 17:52:44.126394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.126432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.356 [2024-11-19 17:52:44.126446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.356 #20 NEW cov: 11806 ft: 13715 corp: 15/282b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 PersAutoDict- DE: "\377\377\377~"- 00:08:51.356 [2024-11-19 17:52:44.166194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.356 [2024-11-19 17:52:44.166220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.166267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.356 [2024-11-19 17:52:44.166282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.166334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.356 [2024-11-19 17:52:44.166348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.166403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.356 [2024-11-19 17:52:44.166418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.356 #21 NEW cov: 11806 ft: 13799 corp: 16/305b lim: 25 exec/s: 0 rss: 68Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:51.356 [2024-11-19 17:52:44.206317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.356 [2024-11-19 17:52:44.206343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.206407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.356 [2024-11-19 17:52:44.206422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.206473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.356 [2024-11-19 17:52:44.206488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.356 [2024-11-19 17:52:44.206541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.356 [2024-11-19 17:52:44.206555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.615 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:51.615 #22 NEW cov: 11829 ft: 13813 corp: 17/325b lim: 25 exec/s: 0 rss: 68Mb L: 20/25 MS: 1 CrossOver- 00:08:51.615 [2024-11-19 17:52:44.246548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.615 [2024-11-19 17:52:44.246575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.615 [2024-11-19 17:52:44.246646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.615 [2024-11-19 17:52:44.246663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.615 [2024-11-19 17:52:44.246714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.615 [2024-11-19 17:52:44.246728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.246788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.616 [2024-11-19 17:52:44.246803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.246851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.616 [2024-11-19 17:52:44.246865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.616 #23 NEW cov: 11829 ft: 13865 corp: 18/350b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:51.616 [2024-11-19 17:52:44.286553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.616 [2024-11-19 17:52:44.286580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.286647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.616 [2024-11-19 17:52:44.286663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.286714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.616 [2024-11-19 17:52:44.286732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.286795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.616 [2024-11-19 17:52:44.286809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.616 #24 NEW cov: 11829 ft: 13923 corp: 19/371b lim: 25 exec/s: 0 rss: 68Mb L: 21/25 MS: 1 CrossOver- 00:08:51.616 [2024-11-19 17:52:44.326667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.616 [2024-11-19 17:52:44.326694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.326750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.616 [2024-11-19 17:52:44.326775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.326825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.616 [2024-11-19 17:52:44.326840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.326891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.616 [2024-11-19 17:52:44.326905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.616 #25 NEW cov: 11829 ft: 13945 corp: 20/394b lim: 25 exec/s: 25 rss: 68Mb L: 23/25 MS: 1 ChangeBinInt- 00:08:51.616 [2024-11-19 17:52:44.366655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.616 [2024-11-19 17:52:44.366682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.366744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.616 [2024-11-19 17:52:44.366770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.366821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.616 [2024-11-19 17:52:44.366835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.616 #26 NEW cov: 11829 ft: 13968 corp: 21/410b lim: 25 exec/s: 26 rss: 68Mb L: 16/25 MS: 1 ShuffleBytes- 00:08:51.616 [2024-11-19 17:52:44.406769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.616 [2024-11-19 17:52:44.406796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.406841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.616 [2024-11-19 17:52:44.406856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.406908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.616 [2024-11-19 17:52:44.406923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.616 #27 NEW cov: 11829 ft: 13992 corp: 22/427b lim: 25 exec/s: 27 rss: 68Mb L: 17/25 MS: 1 ShuffleBytes- 00:08:51.616 [2024-11-19 17:52:44.446885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.616 [2024-11-19 17:52:44.446912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.446953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.616 [2024-11-19 17:52:44.446967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.616 [2024-11-19 17:52:44.447018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.616 [2024-11-19 17:52:44.447032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.616 #28 NEW cov: 11829 ft: 14006 corp: 23/445b lim: 25 exec/s: 28 rss: 68Mb L: 18/25 MS: 1 EraseBytes- 00:08:51.875 [2024-11-19 17:52:44.487025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.875 [2024-11-19 17:52:44.487052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.875 [2024-11-19 17:52:44.487088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.875 [2024-11-19 17:52:44.487103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.875 [2024-11-19 17:52:44.487153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.875 [2024-11-19 17:52:44.487167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.875 #29 NEW cov: 11829 ft: 14064 corp: 24/463b lim: 25 exec/s: 29 rss: 68Mb L: 18/25 MS: 1 ShuffleBytes- 00:08:51.875 [2024-11-19 17:52:44.527273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.876 [2024-11-19 17:52:44.527299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.527351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.876 [2024-11-19 17:52:44.527366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.527420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.876 [2024-11-19 17:52:44.527436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:51.876 [2024-11-19 17:52:44.527489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.876 [2024-11-19 17:52:44.527504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.876 #35 NEW cov: 11829 ft: 14074 corp: 25/487b lim: 25 exec/s: 35 rss: 68Mb L: 24/25 MS: 1 InsertByte- 00:08:51.876 [2024-11-19 17:52:44.567513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.876 [2024-11-19 17:52:44.567539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.567593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.876 [2024-11-19 17:52:44.567611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.567663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.876 [2024-11-19 17:52:44.567678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.567731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.876 [2024-11-19 17:52:44.567746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.567799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.876 [2024-11-19 17:52:44.567814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.876 #36 NEW cov: 11829 ft: 14100 corp: 26/512b lim: 25 exec/s: 36 rss: 69Mb L: 25/25 MS: 1 CopyPart- 00:08:51.876 [2024-11-19 17:52:44.607386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.876 [2024-11-19 17:52:44.607413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.607452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.876 [2024-11-19 17:52:44.607467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.607521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.876 [2024-11-19 17:52:44.607536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.876 #37 NEW cov: 11829 ft: 14125 corp: 27/529b lim: 25 exec/s: 37 rss: 69Mb L: 17/25 MS: 1 CrossOver- 00:08:51.876 [2024-11-19 17:52:44.647618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.876 [2024-11-19 17:52:44.647644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.876 
[2024-11-19 17:52:44.647694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.876 [2024-11-19 17:52:44.647708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.647760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.876 [2024-11-19 17:52:44.647774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.647824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.876 [2024-11-19 17:52:44.647838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.876 #38 NEW cov: 11829 ft: 14129 corp: 28/552b lim: 25 exec/s: 38 rss: 69Mb L: 23/25 MS: 1 CMP- DE: "\003\000\000\000"- 00:08:51.876 [2024-11-19 17:52:44.687605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.876 [2024-11-19 17:52:44.687633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.687672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.876 [2024-11-19 17:52:44.687687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.687738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.876 [2024-11-19 17:52:44.687753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.876 #39 NEW cov: 11829 ft: 14141 corp: 29/569b lim: 25 exec/s: 39 rss: 69Mb L: 17/25 MS: 1 ChangeBinInt- 00:08:51.876 [2024-11-19 17:52:44.727712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.876 [2024-11-19 17:52:44.727738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.727798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.876 [2024-11-19 17:52:44.727817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.876 [2024-11-19 17:52:44.727871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.876 [2024-11-19 17:52:44.727886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.135 #40 NEW cov: 11829 ft: 14144 corp: 30/585b lim: 25 exec/s: 40 rss: 69Mb L: 16/25 MS: 1 ShuffleBytes- 00:08:52.135 [2024-11-19 17:52:44.767867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.135 [2024-11-19 17:52:44.767893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:52.135 [2024-11-19 17:52:44.767928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.135 [2024-11-19 17:52:44.767942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.767993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.135 [2024-11-19 17:52:44.768007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.135 #41 NEW cov: 11829 ft: 14182 corp: 31/602b lim: 25 exec/s: 41 rss: 69Mb L: 17/25 MS: 1 CrossOver- 00:08:52.135 [2024-11-19 17:52:44.808064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.135 [2024-11-19 17:52:44.808089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.808140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.135 [2024-11-19 17:52:44.808154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.808202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.135 [2024-11-19 17:52:44.808217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.808266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.135 [2024-11-19 17:52:44.808279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.135 #42 NEW cov: 11829 ft: 14195 corp: 32/623b lim: 25 exec/s: 42 rss: 69Mb L: 21/25 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:52.135 [2024-11-19 17:52:44.848070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.135 [2024-11-19 17:52:44.848097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.848159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.135 [2024-11-19 17:52:44.848173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.848223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.135 [2024-11-19 17:52:44.848237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.135 #43 NEW cov: 11829 ft: 14260 corp: 33/640b lim: 25 exec/s: 43 rss: 69Mb L: 17/25 MS: 1 ChangeBinInt- 00:08:52.135 [2024-11-19 17:52:44.888224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.135 [2024-11-19 17:52:44.888249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:52.135 [2024-11-19 17:52:44.888305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.135 [2024-11-19 17:52:44.888319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.888369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.135 [2024-11-19 17:52:44.888383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.135 #44 NEW cov: 11829 ft: 14270 corp: 34/658b lim: 25 exec/s: 44 rss: 69Mb L: 18/25 MS: 1 InsertByte- 00:08:52.135 [2024-11-19 17:52:44.928220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.135 [2024-11-19 17:52:44.928247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.135 [2024-11-19 17:52:44.928308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.135 [2024-11-19 17:52:44.928323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.135 #45 NEW cov: 11829 ft: 14516 corp: 35/669b lim: 25 exec/s: 45 rss: 69Mb L: 11/25 MS: 1 EraseBytes- 00:08:52.135 [2024-11-19 17:52:44.968533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.135 [2024-11-19 17:52:44.968560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.136 [2024-11-19 17:52:44.968608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.136 [2024-11-19 17:52:44.968624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.136 [2024-11-19 17:52:44.968676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.136 [2024-11-19 17:52:44.968690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.136 [2024-11-19 17:52:44.968743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.136 [2024-11-19 17:52:44.968758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.136 #46 NEW cov: 11829 ft: 14530 corp: 36/692b lim: 25 exec/s: 46 rss: 69Mb L: 23/25 MS: 1 ChangeByte- 00:08:52.395 [2024-11-19 17:52:45.008561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.395 [2024-11-19 17:52:45.008589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.008640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.395 [2024-11-19 17:52:45.008655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.395 
[2024-11-19 17:52:45.008707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.395 [2024-11-19 17:52:45.008721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.395 #47 NEW cov: 11829 ft: 14538 corp: 37/709b lim: 25 exec/s: 47 rss: 69Mb L: 17/25 MS: 1 ChangeBinInt- 00:08:52.395 [2024-11-19 17:52:45.038785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.395 [2024-11-19 17:52:45.038813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.038855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.395 [2024-11-19 17:52:45.038874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.038928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.395 [2024-11-19 17:52:45.038945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.038997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.395 [2024-11-19 17:52:45.039011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.395 #48 NEW cov: 11829 ft: 14583 corp: 38/730b lim: 25 exec/s: 48 rss: 69Mb L: 21/25 MS: 1 ChangeByte- 00:08:52.395 [2024-11-19 17:52:45.078905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.395 [2024-11-19 17:52:45.078932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.078968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.395 [2024-11-19 17:52:45.078982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.079032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.395 [2024-11-19 17:52:45.079045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 17:52:45.079095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.395 [2024-11-19 17:52:45.079110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.395 #49 NEW cov: 11829 ft: 14605 corp: 39/753b lim: 25 exec/s: 49 rss: 69Mb L: 23/25 MS: 1 CopyPart- 00:08:52.395 [2024-11-19 17:52:45.118988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.395 [2024-11-19 17:52:45.119014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.395 [2024-11-19 
17:52:45.119064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.396 [2024-11-19 17:52:45.119080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.119131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.396 [2024-11-19 17:52:45.119145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.119196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.396 [2024-11-19 17:52:45.119210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.396 #50 NEW cov: 11829 ft: 14612 corp: 40/774b lim: 25 exec/s: 50 rss: 69Mb L: 21/25 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:52.396 [2024-11-19 17:52:45.159195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.396 [2024-11-19 17:52:45.159222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.159277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.396 [2024-11-19 17:52:45.159291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.159349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.396 [2024-11-19 17:52:45.159361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.159413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.396 [2024-11-19 17:52:45.159427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.159482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.396 [2024-11-19 17:52:45.159496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.396 #51 NEW cov: 11829 ft: 14613 corp: 41/799b lim: 25 exec/s: 51 rss: 69Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:52.396 [2024-11-19 17:52:45.199069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.396 [2024-11-19 17:52:45.199095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.199131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.396 [2024-11-19 17:52:45.199146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.199197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:2 nsid:0 00:08:52.396 [2024-11-19 17:52:45.199211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.396 #52 NEW cov: 11829 ft: 14615 corp: 42/816b lim: 25 exec/s: 52 rss: 69Mb L: 17/25 MS: 1 ShuffleBytes- 00:08:52.396 [2024-11-19 17:52:45.229160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.396 [2024-11-19 17:52:45.229188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.229242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.396 [2024-11-19 17:52:45.229258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.396 [2024-11-19 17:52:45.229310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.396 [2024-11-19 17:52:45.229325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.396 #53 NEW cov: 11829 ft: 14674 corp: 43/833b lim: 25 exec/s: 53 rss: 69Mb L: 17/25 MS: 1 ChangeByte- 00:08:52.656 [2024-11-19 17:52:45.269370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.656 [2024-11-19 17:52:45.269397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.656 [2024-11-19 17:52:45.269459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.656 [2024-11-19 17:52:45.269475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.656 [2024-11-19 17:52:45.269525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.656 [2024-11-19 17:52:45.269540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.656 [2024-11-19 17:52:45.269592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.656 [2024-11-19 17:52:45.269613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.656 #54 NEW cov: 11829 ft: 14679 corp: 44/857b lim: 25 exec/s: 54 rss: 69Mb L: 24/25 MS: 1 CrossOver- 00:08:52.656 [2024-11-19 17:52:45.309479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.656 [2024-11-19 17:52:45.309506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.656 [2024-11-19 17:52:45.309552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.656 [2024-11-19 17:52:45.309567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.656 [2024-11-19 17:52:45.309614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 
nsid:0
00:08:52.656 [2024-11-19 17:52:45.309629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:52.656 [2024-11-19 17:52:45.309696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:52.656 [2024-11-19 17:52:45.309711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:52.656 #55 NEW cov: 11829 ft: 14729 corp: 45/880b lim: 25 exec/s: 27 rss: 69Mb L: 23/25 MS: 1 ChangeByte-
00:08:52.656 #55 DONE cov: 11829 ft: 14729 corp: 45/880b lim: 25 exec/s: 27 rss: 69Mb
00:08:52.656 ###### Recommended dictionary. ######
00:08:52.656 "\377\377\377~" # Uses: 1
00:08:52.656 "\003\000\000\000" # Uses: 0
00:08:52.656 "\377\377\377\377" # Uses: 1
00:08:52.656 ###### End of recommended dictionary. ######
00:08:52.656 Done 55 runs in 2 second(s)
00:08:52.656 17:52:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
00:08:52.656 17:52:45 -- ../common.sh@72 -- # (( i++ ))
00:08:52.656 17:52:45 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:52.656 17:52:45 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:08:52.656 17:52:45 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:08:52.656 17:52:45 -- nvmf/run.sh@24 -- # local timen=1
00:08:52.656 17:52:45 -- nvmf/run.sh@25 -- # local core=0x1
00:08:52.656 17:52:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:52.656 17:52:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:08:52.656 17:52:45 -- nvmf/run.sh@29 -- # printf %02d 24
00:08:52.656 17:52:45 -- nvmf/run.sh@29 -- # port=4424
00:08:52.656 17:52:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:52.656 17:52:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:08:52.656 17:52:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:52.656 17:52:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:08:52.656 [2024-11-19 17:52:45.474078] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:52.656 [2024-11-19 17:52:45.474148] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644862 ]
00:08:52.916 EAL: No free 2048 kB hugepages reported on node 1
00:08:52.916 [2024-11-19 17:52:45.648741] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:52.916 [2024-11-19 17:52:45.667729] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:52.916 [2024-11-19 17:52:45.667864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:52.916 [2024-11-19 17:52:45.719152] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:52.916 [2024-11-19 17:52:45.735482] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:08:52.916 INFO: Running with entropic power schedule (0xFF, 100).
00:08:52.916 INFO: Seed: 2031769555
00:08:52.916 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:52.916 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:52.916 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:52.916 INFO: A corpus is not provided, starting from an empty corpus
00:08:52.916 #2 INITED exec/s: 0 rss: 59Mb
00:08:52.916 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:52.916 This may also happen if the target rejected all inputs we tried so far
00:08:53.175 [2024-11-19 17:52:45.780915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:53.175 [2024-11-19 17:52:45.780946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:53.175 [2024-11-19 17:52:45.780982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:53.175 [2024-11-19 17:52:45.781013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:53.176 [2024-11-19 17:52:45.781064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:53.176 [2024-11-19 17:52:45.781079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:53.176 [2024-11-19 17:52:45.781129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:53.176 [2024-11-19 17:52:45.781143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:53.436 NEW_FUNC[1/672]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685
00:08:53.436 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:53.436 #3 NEW cov: 11674 ft: 11672 corp: 2/85b lim: 100 exec/s: 0 rss: 67Mb L:
84/84 MS: 1 InsertRepeatedBytes- 00:08:53.436 [2024-11-19 17:52:46.081532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.081564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.081620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.081635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.081690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.081721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.436 #10 NEW cov: 11787 ft: 12566 corp: 3/155b lim: 100 exec/s: 0 rss: 67Mb L: 70/84 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:53.436 [2024-11-19 17:52:46.121744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277635031 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.121773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.121811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.121830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.121880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.121895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.121944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.121959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.436 #11 NEW cov: 11793 ft: 12821 corp: 4/247b lim: 100 exec/s: 0 rss: 67Mb L: 92/92 MS: 1 CMP- DE: "\321\327\036\002rl\214\000"- 00:08:53.436 [2024-11-19 17:52:46.161861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.161891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.161926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.161941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.161993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.162008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.162057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.162072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.436 #14 NEW cov: 11878 ft: 13065 corp: 5/346b lim: 100 exec/s: 0 rss: 67Mb L: 99/99 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:53.436 [2024-11-19 17:52:46.202140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277635031 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.202167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.202217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.202231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.202282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.202297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.202346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.202361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.202413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:3492391389842386992 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.202428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.436 #15 NEW cov: 11878 ft: 13215 corp: 6/446b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:53.436 [2024-11-19 17:52:46.242113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.242140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.242178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.242194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.242245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.242260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.242312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.242326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.436 #16 NEW cov: 11878 ft: 13335 corp: 7/545b lim: 100 exec/s: 0 rss: 67Mb L: 99/100 MS: 1 ChangeByte- 00:08:53.436 [2024-11-19 17:52:46.282331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.282358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.282407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.282422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.282471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.282486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.282538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.282553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.436 [2024-11-19 17:52:46.282634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.436 [2024-11-19 17:52:46.282650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.696 #17 NEW cov: 11878 ft: 13364 corp: 8/645b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 CopyPart- 00:08:53.696 [2024-11-19 17:52:46.322444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.322471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.322523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.322539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.322589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.322612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.322664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.322679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.322730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.322745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.696 #18 NEW cov: 11878 ft: 13392 corp: 9/745b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 InsertByte- 00:08:53.696 [2024-11-19 17:52:46.362431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.362458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.362507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.362522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.362574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.362589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.696 [2024-11-19 17:52:46.362663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.696 [2024-11-19 17:52:46.362679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.696 #19 NEW cov: 11878 ft: 13502 corp: 10/844b lim: 100 exec/s: 0 rss: 67Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:53.697 [2024-11-19 17:52:46.402726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.402752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.402801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.402817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.402867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.402882] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.402930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.402945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.402997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.403011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.697 #20 NEW cov: 11878 ft: 13566 corp: 11/944b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 ChangeBit- 00:08:53.697 [2024-11-19 17:52:46.442558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.442586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.442631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.442646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.442698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.442714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.697 #21 NEW cov: 11878 ft: 13597 corp: 12/1021b lim: 100 exec/s: 0 rss: 67Mb L: 77/100 MS: 1 EraseBytes- 00:08:53.697 [2024-11-19 17:52:46.482924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277592023 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.482952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.483001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.483017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.483068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.483083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.483132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.483147] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.483196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:3492391389842386992 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.483212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.697 #22 NEW cov: 11878 ft: 13648 corp: 13/1121b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:53.697 [2024-11-19 17:52:46.522924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.522953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.523011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.523027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.523078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.523094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.697 [2024-11-19 17:52:46.523146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.697 [2024-11-19 17:52:46.523165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.697 #23 NEW cov: 11878 ft: 13689 corp: 14/1220b lim: 100 exec/s: 0 rss: 68Mb L: 99/100 MS: 1 CopyPart- 00:08:53.957 [2024-11-19 17:52:46.563078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277635031 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.563105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.563144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.563159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.563211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.563226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.563278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.563293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.957 #24 NEW cov: 11878 ft: 13712 corp: 15/1312b lim: 100 exec/s: 0 rss: 68Mb L: 92/100 MS: 1 ChangeBit- 00:08:53.957 [2024-11-19 17:52:46.603277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.603304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.603356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.603370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.603419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.603433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.603486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.603501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.603554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.603568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.957 #25 NEW cov: 11878 ft: 13719 corp: 16/1412b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:08:53.957 [2024-11-19 17:52:46.643085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.643113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.643152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.643169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.643221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.643251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.957 #26 NEW cov: 11878 ft: 13761 corp: 17/1489b lim: 100 exec/s: 0 rss: 68Mb L: 77/100 MS: 1 CopyPart- 00:08:53.957 [2024-11-19 17:52:46.683243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.683270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.683311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.683326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.957 [2024-11-19 17:52:46.683378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.957 [2024-11-19 17:52:46.683393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.957 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.957 #27 NEW cov: 11901 ft: 13814 corp: 18/1555b lim: 100 exec/s: 0 rss: 68Mb L: 66/100 MS: 1 EraseBytes- 00:08:53.957 [2024-11-19 17:52:46.723311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277635031 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.723338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.723391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.723407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.723459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.723473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.958 #28 NEW cov: 11901 ft: 13881 corp: 19/1633b lim: 100 exec/s: 0 rss: 68Mb L: 78/100 MS: 1 EraseBytes- 00:08:53.958 [2024-11-19 17:52:46.763587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.763619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.763660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.763676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.763728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.763743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.763794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.763813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.958 #29 NEW cov: 11901 ft: 13922 corp: 20/1717b lim: 100 exec/s: 29 rss: 68Mb L: 84/100 MS: 1 ChangeByte- 00:08:53.958 [2024-11-19 17:52:46.803844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.803872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.803914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.803929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.803982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.803996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.804049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.804064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.958 [2024-11-19 17:52:46.804116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.958 [2024-11-19 17:52:46.804130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.218 #30 NEW cov: 11901 ft: 13932 corp: 21/1817b lim: 100 exec/s: 30 rss: 68Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:54.218 [2024-11-19 17:52:46.843950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277592023 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.843978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.844042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:33626875213381632 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.844058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.844110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.844125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.844174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 
17:52:46.844189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.844239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:3492391389842386992 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.844254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.218 #31 NEW cov: 11901 ft: 13941 corp: 22/1917b lim: 100 exec/s: 31 rss: 68Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:54.218 [2024-11-19 17:52:46.883974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.884005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.884057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.884073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.884126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.884141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.884193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.884209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.218 #32 NEW cov: 11901 ft: 14037 corp: 23/2016b lim: 100 exec/s: 32 rss: 68Mb L: 99/100 MS: 1 ChangeByte- 00:08:54.218 [2024-11-19 17:52:46.924204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.924231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.924282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1060856954624 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.924297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.218 [2024-11-19 17:52:46.924350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.218 [2024-11-19 17:52:46.924379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:46.924431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.924447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:46.924500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.924515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.219 #33 NEW cov: 11901 ft: 14049 corp: 24/2116b lim: 100 exec/s: 33 rss: 68Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:54.219 [2024-11-19 17:52:46.964326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277592023 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.964352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:46.964424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.964439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:46.964491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.964505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:46.964559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.964574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:46.964631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:3492391389842386992 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:46.964646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.219 #34 NEW cov: 11901 ft: 14062 corp: 25/2216b lim: 100 exec/s: 34 rss: 68Mb L: 100/100 MS: 1 ChangeBit- 00:08:54.219 [2024-11-19 17:52:47.004292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.004318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.004372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.004388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.004443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.004457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.004510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.004525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.219 #35 NEW cov: 11901 ft: 14076 corp: 26/2315b lim: 100 exec/s: 35 rss: 69Mb L: 99/100 MS: 1 ChangeByte- 00:08:54.219 [2024-11-19 17:52:47.044552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.044580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.044651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.044668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.044718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:261993005056 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.044732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.044794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.044809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.219 [2024-11-19 17:52:47.044862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.219 [2024-11-19 17:52:47.044877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.219 #36 NEW cov: 11901 ft: 14088 corp: 27/2415b lim: 100 exec/s: 36 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:08:54.480 [2024-11-19 17:52:47.084466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277635031 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.084496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.084533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.084549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.084620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.084637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:08:54.480 [2024-11-19 17:52:47.084690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.084706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.480 #37 NEW cov: 11901 ft: 14098 corp: 28/2510b lim: 100 exec/s: 37 rss: 69Mb L: 95/100 MS: 1 CopyPart- 00:08:54.480 [2024-11-19 17:52:47.124745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.124772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.124839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.124855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.124910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.124925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.124977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.124993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.125047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.125062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.480 #38 NEW cov: 11901 ft: 14132 corp: 29/2610b lim: 100 exec/s: 38 rss: 69Mb L: 100/100 MS: 1 ChangeBit- 00:08:54.480 [2024-11-19 17:52:47.164448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.164474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.164508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2113929216 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.164523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.480 #39 NEW cov: 11901 ft: 14464 corp: 30/2667b lim: 100 exec/s: 39 rss: 69Mb L: 57/100 MS: 1 CrossOver- 00:08:54.480 [2024-11-19 17:52:47.204840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277635031 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.204872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.204905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.204920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.204974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.204988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.205058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.205074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.480 #40 NEW cov: 11901 ft: 14496 corp: 31/2748b lim: 100 exec/s: 40 rss: 69Mb L: 81/100 MS: 1 CopyPart- 00:08:54.480 [2024-11-19 17:52:47.245007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.245034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.245081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.245096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.245145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.245160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.245211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.245225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.480 #41 NEW cov: 11901 ft: 14502 corp: 32/2847b lim: 100 exec/s: 41 rss: 69Mb L: 99/100 MS: 1 ChangeByte- 00:08:54.480 [2024-11-19 17:52:47.285233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.285260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.285303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.285316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:54.480 [2024-11-19 17:52:47.285367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.285383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.285433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.285448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.285506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.285521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.480 #42 NEW cov: 11901 ft: 14543 corp: 33/2947b lim: 100 exec/s: 42 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:08:54.480 [2024-11-19 17:52:47.325346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.325372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.325426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.325441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.325492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.325507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.325560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.325575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.480 [2024-11-19 17:52:47.325633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.480 [2024-11-19 17:52:47.325648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.740 #43 NEW cov: 11901 ft: 14617 corp: 34/3047b lim: 100 exec/s: 43 rss: 69Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:54.740 [2024-11-19 17:52:47.365154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982685184 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.365180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.365217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.365232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.365290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.365305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.740 #44 NEW cov: 11901 ft: 14643 corp: 35/3114b lim: 100 exec/s: 44 rss: 69Mb L: 67/100 MS: 1 InsertByte- 00:08:54.740 [2024-11-19 17:52:47.405431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277592023 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.405457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.405506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:33626875213381632 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.405520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.405573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.405591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.405651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.405667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.740 #45 NEW cov: 11901 ft: 14656 corp: 36/3196b lim: 100 exec/s: 45 rss: 69Mb L: 82/100 MS: 1 EraseBytes- 00:08:54.740 [2024-11-19 17:52:47.445528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581256008151 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.445555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.445606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.445622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.445674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.445690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.445744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:54.740 [2024-11-19 17:52:47.445759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.740 #46 NEW cov: 11901 ft: 14670 corp: 37/3295b lim: 100 exec/s: 46 rss: 69Mb L: 99/100 MS: 1 PersAutoDict- DE: "\321\327\036\002rl\214\000"- 00:08:54.740 [2024-11-19 17:52:47.485796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.485822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.485885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.485901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.485951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.485966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.486017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16777216 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.486033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.486084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.486098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.740 #47 NEW cov: 11901 ft: 14713 corp: 38/3395b lim: 100 exec/s: 47 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:08:54.740 [2024-11-19 17:52:47.525772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.525802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.525839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.525855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.525907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.525922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.525974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.525988] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.740 #48 NEW cov: 11901 ft: 14751 corp: 39/3494b lim: 100 exec/s: 48 rss: 69Mb L: 99/100 MS: 1 CopyPart- 00:08:54.740 [2024-11-19 17:52:47.565768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.565795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.565835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.565851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.740 [2024-11-19 17:52:47.565904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.740 [2024-11-19 17:52:47.565920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.740 #49 NEW cov: 11901 ft: 14783 corp: 40/3563b lim: 100 exec/s: 49 rss: 69Mb L: 69/100 MS: 1 EraseBytes- 00:08:55.001 [2024-11-19 17:52:47.605955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982685184 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.605981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.606014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.606029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.606081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.606096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.001 #50 NEW cov: 11901 ft: 14792 corp: 41/3632b lim: 100 exec/s: 50 rss: 70Mb L: 69/100 MS: 1 CMP- DE: "\017\000"- 00:08:55.001 [2024-11-19 17:52:47.646130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982685184 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.646158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.646201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.646217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.646274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.646290] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.646341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.646356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.001 #51 NEW cov: 11901 ft: 14803 corp: 42/3717b lim: 100 exec/s: 51 rss: 70Mb L: 85/100 MS: 1 CrossOver- 00:08:55.001 [2024-11-19 17:52:47.686415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057596020588670 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.686441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.686493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32512 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.686509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.686561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:261993005056 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.686576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.686646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.686662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.686715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.686731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:55.001 #52 NEW cov: 11901 ft: 14817 corp: 43/3817b lim: 100 exec/s: 52 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:08:55.001 [2024-11-19 17:52:47.726484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1982660734 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.726511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.726566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.726581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.726653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.726669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.726721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.726737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.726793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.726808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:55.001 #53 NEW cov: 11901 ft: 14826 corp: 44/3917b lim: 100 exec/s: 53 rss: 70Mb L: 100/100 MS: 1 CopyPart- 00:08:55.001 [2024-11-19 17:52:47.766493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2162416581277592023 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.766521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.766576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:33626875213381632 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.766590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.766649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:8597935151709124471 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.766663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.001 [2024-11-19 17:52:47.766718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.001 [2024-11-19 17:52:47.766734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.001 #54 NEW cov: 11901 ft: 14835 corp: 45/3999b lim: 100 exec/s: 27 rss: 70Mb L: 82/100 MS: 1 ChangeBinInt- 00:08:55.001 #54 DONE cov: 11901 ft: 14835 corp: 45/3999b lim: 100 exec/s: 27 rss: 70Mb 00:08:55.001 ###### Recommended dictionary. ###### 00:08:55.001 "\321\327\036\002rl\214\000" # Uses: 1 00:08:55.001 "\017\000" # Uses: 0 00:08:55.001 ###### End of recommended dictionary. 
###### 00:08:55.001 Done 54 runs in 2 second(s) 00:08:55.262 17:52:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:55.262 17:52:47 -- ../common.sh@72 -- # (( i++ )) 00:08:55.262 17:52:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.262 17:52:47 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:55.262 00:08:55.262 real 1m3.913s 00:08:55.262 user 1m39.214s 00:08:55.262 sys 0m8.455s 00:08:55.262 17:52:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:55.262 17:52:47 -- common/autotest_common.sh@10 -- # set +x 00:08:55.262 ************************************ 00:08:55.262 END TEST nvmf_fuzz 00:08:55.262 ************************************ 00:08:55.262 17:52:47 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:55.262 17:52:47 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:55.262 17:52:47 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:55.262 17:52:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:55.262 17:52:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:55.262 17:52:47 -- common/autotest_common.sh@10 -- # set +x 00:08:55.262 ************************************ 00:08:55.262 START TEST vfio_fuzz 00:08:55.262 ************************************ 00:08:55.262 17:52:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:55.262 * Looking for test storage... 00:08:55.262 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.262 17:52:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:55.262 17:52:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:55.262 17:52:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:55.262 17:52:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:55.262 17:52:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:55.262 17:52:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:55.262 17:52:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:55.262 17:52:48 -- scripts/common.sh@335 -- # IFS=.-: 00:08:55.262 17:52:48 -- scripts/common.sh@335 -- # read -ra ver1 00:08:55.262 17:52:48 -- scripts/common.sh@336 -- # IFS=.-: 00:08:55.262 17:52:48 -- scripts/common.sh@336 -- # read -ra ver2 00:08:55.262 17:52:48 -- scripts/common.sh@337 -- # local 'op=<' 00:08:55.262 17:52:48 -- scripts/common.sh@339 -- # ver1_l=2 00:08:55.262 17:52:48 -- scripts/common.sh@340 -- # ver2_l=1 00:08:55.262 17:52:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:55.262 17:52:48 -- scripts/common.sh@343 -- # case "$op" in 00:08:55.262 17:52:48 -- scripts/common.sh@344 -- # : 1 00:08:55.262 17:52:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:55.262 17:52:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:55.262 17:52:48 -- scripts/common.sh@364 -- # decimal 1 00:08:55.262 17:52:48 -- scripts/common.sh@352 -- # local d=1 00:08:55.262 17:52:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:55.262 17:52:48 -- scripts/common.sh@354 -- # echo 1 00:08:55.525 17:52:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:55.525 17:52:48 -- scripts/common.sh@365 -- # decimal 2 00:08:55.525 17:52:48 -- scripts/common.sh@352 -- # local d=2 00:08:55.525 17:52:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:55.525 17:52:48 -- scripts/common.sh@354 -- # echo 2 00:08:55.525 17:52:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:55.525 17:52:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:55.525 17:52:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:55.525 17:52:48 -- scripts/common.sh@367 -- # return 0 00:08:55.525 17:52:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:55.525 17:52:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:55.525 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.525 --rc genhtml_branch_coverage=1 00:08:55.525 --rc genhtml_function_coverage=1 00:08:55.525 --rc genhtml_legend=1 00:08:55.525 --rc geninfo_all_blocks=1 00:08:55.525 --rc geninfo_unexecuted_blocks=1 00:08:55.525 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.525 ' 00:08:55.525 17:52:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:55.525 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.525 --rc genhtml_branch_coverage=1 00:08:55.525 --rc genhtml_function_coverage=1 00:08:55.525 --rc genhtml_legend=1 00:08:55.525 --rc geninfo_all_blocks=1 00:08:55.525 --rc geninfo_unexecuted_blocks=1 00:08:55.525 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.525 ' 00:08:55.525 17:52:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:55.525 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.525 --rc genhtml_branch_coverage=1 00:08:55.525 --rc genhtml_function_coverage=1 00:08:55.525 --rc genhtml_legend=1 00:08:55.525 --rc geninfo_all_blocks=1 00:08:55.525 --rc geninfo_unexecuted_blocks=1 00:08:55.526 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.526 ' 00:08:55.526 17:52:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:55.526 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.526 --rc genhtml_branch_coverage=1 00:08:55.526 --rc genhtml_function_coverage=1 00:08:55.526 --rc genhtml_legend=1 00:08:55.526 --rc geninfo_all_blocks=1 00:08:55.526 --rc geninfo_unexecuted_blocks=1 00:08:55.526 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.526 ' 00:08:55.526 17:52:48 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:55.526 17:52:48 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:55.526 17:52:48 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:55.526 17:52:48 -- common/autotest_common.sh@34 -- # set -e 00:08:55.526 17:52:48 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:55.526 17:52:48 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:55.526 17:52:48 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:55.526 17:52:48 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:55.526 17:52:48 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:55.526 17:52:48 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:55.526 17:52:48 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:55.526 17:52:48 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:55.526 17:52:48 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:55.526 17:52:48 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:55.526 17:52:48 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:55.526 17:52:48 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:55.526 17:52:48 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:55.526 17:52:48 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:55.526 17:52:48 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:55.526 17:52:48 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:55.526 17:52:48 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:55.526 17:52:48 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:55.526 17:52:48 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:55.526 17:52:48 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:55.526 17:52:48 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:55.526 17:52:48 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:55.526 17:52:48 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:55.526 17:52:48 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:55.526 17:52:48 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:55.526 17:52:48 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:55.526 17:52:48 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:55.526 17:52:48 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:55.526 17:52:48 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:55.526 17:52:48 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:55.526 17:52:48 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:55.526 17:52:48 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:55.526 17:52:48 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:55.526 17:52:48 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:55.526 17:52:48 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:55.526 17:52:48 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:55.526 17:52:48 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:55.526 17:52:48 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:55.526 17:52:48 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:55.526 17:52:48 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:55.526 17:52:48 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:55.526 17:52:48 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:55.526 17:52:48 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:55.526 17:52:48 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:55.526 17:52:48 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:55.526 
17:52:48 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:55.526 17:52:48 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:55.526 17:52:48 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:55.526 17:52:48 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:55.526 17:52:48 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:55.526 17:52:48 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:55.526 17:52:48 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:55.526 17:52:48 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:55.526 17:52:48 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:55.526 17:52:48 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:55.526 17:52:48 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:55.526 17:52:48 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:55.526 17:52:48 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:55.526 17:52:48 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:55.526 17:52:48 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:55.526 17:52:48 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:55.526 17:52:48 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:55.526 17:52:48 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:55.526 17:52:48 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:55.526 17:52:48 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.526 17:52:48 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:55.526 17:52:48 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:55.526 17:52:48 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:55.526 17:52:48 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:55.526 17:52:48 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:55.526 17:52:48 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:55.526 17:52:48 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:55.526 17:52:48 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:55.526 17:52:48 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:55.526 17:52:48 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:55.526 17:52:48 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:55.526 17:52:48 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:55.526 17:52:48 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:55.526 17:52:48 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:55.526 17:52:48 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:55.526 17:52:48 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:55.526 17:52:48 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:55.526 17:52:48 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:55.526 17:52:48 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:55.526 17:52:48 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:55.526 17:52:48 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:55.526 17:52:48 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:55.526 17:52:48 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
00:08:55.526 17:52:48 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.526 17:52:48 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:55.526 17:52:48 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.526 17:52:48 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:55.526 17:52:48 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:55.526 17:52:48 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:55.526 17:52:48 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:55.526 17:52:48 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:55.526 17:52:48 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:55.526 17:52:48 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:55.526 17:52:48 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:55.526 #define SPDK_CONFIG_H 00:08:55.526 #define SPDK_CONFIG_APPS 1 00:08:55.526 #define SPDK_CONFIG_ARCH native 00:08:55.526 #undef SPDK_CONFIG_ASAN 00:08:55.526 #undef SPDK_CONFIG_AVAHI 00:08:55.526 #undef SPDK_CONFIG_CET 00:08:55.526 #define SPDK_CONFIG_COVERAGE 1 00:08:55.526 #define SPDK_CONFIG_CROSS_PREFIX 00:08:55.526 #undef SPDK_CONFIG_CRYPTO 00:08:55.526 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:55.526 #undef SPDK_CONFIG_CUSTOMOCF 00:08:55.526 #undef SPDK_CONFIG_DAOS 00:08:55.526 #define SPDK_CONFIG_DAOS_DIR 00:08:55.527 #define SPDK_CONFIG_DEBUG 1 00:08:55.527 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:55.527 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:55.527 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:55.527 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.527 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:55.527 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:55.527 #define SPDK_CONFIG_EXAMPLES 1 00:08:55.527 #undef SPDK_CONFIG_FC 00:08:55.527 #define SPDK_CONFIG_FC_PATH 00:08:55.527 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:55.527 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:55.527 #undef SPDK_CONFIG_FUSE 00:08:55.527 #define SPDK_CONFIG_FUZZER 1 00:08:55.527 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:55.527 #undef SPDK_CONFIG_GOLANG 00:08:55.527 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:55.527 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:55.527 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:55.527 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:55.527 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:55.527 #define SPDK_CONFIG_IDXD 1 00:08:55.527 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:55.527 #undef SPDK_CONFIG_IPSEC_MB 00:08:55.527 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:55.527 #define SPDK_CONFIG_ISAL 1 00:08:55.527 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:55.527 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:55.527 #define SPDK_CONFIG_LIBDIR 00:08:55.527 #undef SPDK_CONFIG_LTO 00:08:55.527 #define SPDK_CONFIG_MAX_LCORES 00:08:55.527 #define SPDK_CONFIG_NVME_CUSE 1 00:08:55.527 #undef SPDK_CONFIG_OCF 00:08:55.527 #define SPDK_CONFIG_OCF_PATH 00:08:55.527 #define 
SPDK_CONFIG_OPENSSL_PATH 00:08:55.527 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:55.527 #undef SPDK_CONFIG_PGO_USE 00:08:55.527 #define SPDK_CONFIG_PREFIX /usr/local 00:08:55.527 #undef SPDK_CONFIG_RAID5F 00:08:55.527 #undef SPDK_CONFIG_RBD 00:08:55.527 #define SPDK_CONFIG_RDMA 1 00:08:55.527 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:55.527 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:55.527 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:55.527 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:55.527 #undef SPDK_CONFIG_SHARED 00:08:55.527 #undef SPDK_CONFIG_SMA 00:08:55.527 #define SPDK_CONFIG_TESTS 1 00:08:55.527 #undef SPDK_CONFIG_TSAN 00:08:55.527 #define SPDK_CONFIG_UBLK 1 00:08:55.527 #define SPDK_CONFIG_UBSAN 1 00:08:55.527 #undef SPDK_CONFIG_UNIT_TESTS 00:08:55.527 #undef SPDK_CONFIG_URING 00:08:55.527 #define SPDK_CONFIG_URING_PATH 00:08:55.527 #undef SPDK_CONFIG_URING_ZNS 00:08:55.527 #undef SPDK_CONFIG_USDT 00:08:55.527 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:55.527 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:55.527 #define SPDK_CONFIG_VFIO_USER 1 00:08:55.527 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:55.527 #define SPDK_CONFIG_VHOST 1 00:08:55.527 #define SPDK_CONFIG_VIRTIO 1 00:08:55.527 #undef SPDK_CONFIG_VTUNE 00:08:55.527 #define SPDK_CONFIG_VTUNE_DIR 00:08:55.527 #define SPDK_CONFIG_WERROR 1 00:08:55.527 #define SPDK_CONFIG_WPDK_DIR 00:08:55.527 #undef SPDK_CONFIG_XNVME 00:08:55.527 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:55.527 17:52:48 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:55.527 17:52:48 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:55.527 17:52:48 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:55.527 17:52:48 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:55.527 17:52:48 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:55.527 17:52:48 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.527 17:52:48 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.527 17:52:48 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.527 17:52:48 -- paths/export.sh@5 
-- # export PATH 00:08:55.527 17:52:48 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.527 17:52:48 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:55.527 17:52:48 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:55.527 17:52:48 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:55.527 17:52:48 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:55.527 17:52:48 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:55.527 17:52:48 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:55.527 17:52:48 -- pm/common@16 -- # TEST_TAG=N/A 00:08:55.527 17:52:48 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:55.527 17:52:48 -- common/autotest_common.sh@52 -- # : 1 00:08:55.527 17:52:48 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:55.527 17:52:48 -- common/autotest_common.sh@56 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:55.527 17:52:48 -- common/autotest_common.sh@58 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:55.527 17:52:48 -- common/autotest_common.sh@60 -- # : 1 00:08:55.527 17:52:48 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:55.527 17:52:48 -- common/autotest_common.sh@62 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:55.527 17:52:48 -- common/autotest_common.sh@64 -- # : 00:08:55.527 17:52:48 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:55.527 17:52:48 -- common/autotest_common.sh@66 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:55.527 17:52:48 -- common/autotest_common.sh@68 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:55.527 17:52:48 -- common/autotest_common.sh@70 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:55.527 17:52:48 -- common/autotest_common.sh@72 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:55.527 17:52:48 -- common/autotest_common.sh@74 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:55.527 17:52:48 -- common/autotest_common.sh@76 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:55.527 17:52:48 -- common/autotest_common.sh@78 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:55.527 17:52:48 -- common/autotest_common.sh@80 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:55.527 
17:52:48 -- common/autotest_common.sh@82 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:55.527 17:52:48 -- common/autotest_common.sh@84 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:55.527 17:52:48 -- common/autotest_common.sh@86 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:55.527 17:52:48 -- common/autotest_common.sh@88 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:55.527 17:52:48 -- common/autotest_common.sh@90 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:55.527 17:52:48 -- common/autotest_common.sh@92 -- # : 1 00:08:55.527 17:52:48 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:55.527 17:52:48 -- common/autotest_common.sh@94 -- # : 1 00:08:55.527 17:52:48 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:55.527 17:52:48 -- common/autotest_common.sh@96 -- # : rdma 00:08:55.527 17:52:48 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:55.527 17:52:48 -- common/autotest_common.sh@98 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:55.527 17:52:48 -- common/autotest_common.sh@100 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:55.527 17:52:48 -- common/autotest_common.sh@102 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:55.527 17:52:48 -- common/autotest_common.sh@104 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:55.527 17:52:48 -- common/autotest_common.sh@106 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:55.527 17:52:48 -- common/autotest_common.sh@108 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:55.527 17:52:48 -- common/autotest_common.sh@110 -- # : 0 00:08:55.527 17:52:48 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:55.528 17:52:48 -- common/autotest_common.sh@112 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:55.528 17:52:48 -- common/autotest_common.sh@114 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:55.528 17:52:48 -- common/autotest_common.sh@116 -- # : 1 00:08:55.528 17:52:48 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:55.528 17:52:48 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:55.528 17:52:48 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:55.528 17:52:48 -- common/autotest_common.sh@120 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:55.528 17:52:48 -- common/autotest_common.sh@122 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:55.528 17:52:48 -- common/autotest_common.sh@124 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:55.528 17:52:48 -- common/autotest_common.sh@126 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:55.528 17:52:48 -- common/autotest_common.sh@128 -- # : 0 00:08:55.528 17:52:48 -- 
common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:55.528 17:52:48 -- common/autotest_common.sh@130 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:55.528 17:52:48 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:55.528 17:52:48 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:55.528 17:52:48 -- common/autotest_common.sh@134 -- # : true 00:08:55.528 17:52:48 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:55.528 17:52:48 -- common/autotest_common.sh@136 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:55.528 17:52:48 -- common/autotest_common.sh@138 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:55.528 17:52:48 -- common/autotest_common.sh@140 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:55.528 17:52:48 -- common/autotest_common.sh@142 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:55.528 17:52:48 -- common/autotest_common.sh@144 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:55.528 17:52:48 -- common/autotest_common.sh@146 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:55.528 17:52:48 -- common/autotest_common.sh@148 -- # : 00:08:55.528 17:52:48 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:55.528 17:52:48 -- common/autotest_common.sh@150 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:55.528 17:52:48 -- common/autotest_common.sh@152 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:55.528 17:52:48 -- common/autotest_common.sh@154 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:55.528 17:52:48 -- common/autotest_common.sh@156 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:55.528 17:52:48 -- common/autotest_common.sh@158 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:55.528 17:52:48 -- common/autotest_common.sh@160 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:55.528 17:52:48 -- common/autotest_common.sh@163 -- # : 00:08:55.528 17:52:48 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:55.528 17:52:48 -- common/autotest_common.sh@165 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:55.528 17:52:48 -- common/autotest_common.sh@167 -- # : 0 00:08:55.528 17:52:48 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:55.528 17:52:48 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@173 -- # export 
VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.528 17:52:48 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:55.528 17:52:48 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:55.528 17:52:48 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:55.528 17:52:48 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:55.528 17:52:48 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:55.528 17:52:48 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:55.528 17:52:48 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:55.528 17:52:48 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:55.528 17:52:48 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:55.528 17:52:48 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:55.528 17:52:48 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:55.528 17:52:48 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:55.528 17:52:48 -- common/autotest_common.sh@196 -- # cat 00:08:55.528 17:52:48 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:55.528 17:52:48 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:55.528 17:52:48 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:55.528 17:52:48 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:55.528 17:52:48 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:55.528 17:52:48 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:55.528 17:52:48 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:55.528 17:52:48 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.528 17:52:48 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.528 17:52:48 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.528 17:52:48 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.528 17:52:48 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:55.528 17:52:48 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:55.528 17:52:48 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:55.528 17:52:48 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:55.528 17:52:48 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:55.528 17:52:48 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:55.528 17:52:48 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:55.528 17:52:48 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:55.528 17:52:48 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:55.528 17:52:48 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:55.528 17:52:48 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:55.528 17:52:48 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:55.528 17:52:48 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:55.528 17:52:48 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:55.529 17:52:48 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:55.529 17:52:48 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:55.529 17:52:48 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:55.529 17:52:48 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:55.529 17:52:48 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:55.529 17:52:48 -- common/autotest_common.sh@259 -- # valgrind= 00:08:55.529 17:52:48 -- common/autotest_common.sh@265 -- # uname -s 00:08:55.529 17:52:48 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:55.529 17:52:48 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:55.529 17:52:48 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:55.529 17:52:48 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:55.529 17:52:48 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:55.529 17:52:48 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:55.529 17:52:48 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:55.529 17:52:48 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:55.529 17:52:48 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:55.529 17:52:48 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:55.529 17:52:48 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:55.529 17:52:48 -- common/autotest_common.sh@319 -- # [[ -z 645187 ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@319 -- # kill -0 645187 00:08:55.529 17:52:48 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:55.529 17:52:48 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:55.529 17:52:48 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:55.529 17:52:48 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:55.529 17:52:48 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:55.529 17:52:48 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:55.529 17:52:48 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:55.529 17:52:48 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.wrc11V 00:08:55.529 17:52:48 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" 
"$storage_fallback") 00:08:55.529 17:52:48 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.wrc11V/tests/vfio /tmp/spdk.wrc11V 00:08:55.529 17:52:48 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@328 -- # df -T 00:08:55.529 17:52:48 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=53087490048 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=8643117056 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864044032 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340121600 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # 
avails["$mount"]=30864982016 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=323584 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:55.529 17:52:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:55.529 17:52:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:55.529 17:52:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:55.529 17:52:48 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:55.529 * Looking for test storage... 00:08:55.529 17:52:48 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:55.529 17:52:48 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:55.529 17:52:48 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.529 17:52:48 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:55.529 17:52:48 -- common/autotest_common.sh@373 -- # mount=/ 00:08:55.529 17:52:48 -- common/autotest_common.sh@375 -- # target_space=53087490048 00:08:55.529 17:52:48 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:55.529 17:52:48 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:55.529 17:52:48 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@382 -- # new_size=10857709568 00:08:55.529 17:52:48 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:55.529 17:52:48 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.529 17:52:48 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.529 17:52:48 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.529 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.529 17:52:48 -- common/autotest_common.sh@390 -- # return 0 00:08:55.529 17:52:48 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:55.529 17:52:48 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:55.529 17:52:48 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:55.529 17:52:48 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:55.529 17:52:48 -- common/autotest_common.sh@1682 -- # true 00:08:55.529 17:52:48 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:55.529 17:52:48 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:55.529 17:52:48 -- 
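The set_test_storage trace just above probes every mount with df -T, records availability in parallel associative arrays, and exports SPDK_TEST_STORAGE at the first candidate directory whose filesystem has enough room (~2 GiB requested here, ~53 GB found on the overlay root). A condensed, standalone sketch of that selection loop; the variable names follow the trace, while the -B1 flag is an assumption made to keep the arithmetic in bytes:

#!/usr/bin/env bash
requested_size=2147483648            # the ~2 GiB figure from the log
storage_candidates=("$PWD")          # the real run also tries a mktemp fallback

# Index available space per mount point; the read order matches 'df -T'
# columns: Filesystem Type Blocks Used Available Use% Mounted-on.
declare -A avails
while read -r source fs size use avail _ mount; do
    avails["$mount"]=$avail
done < <(df -T -B1 | grep -v Filesystem)   # -B1 assumed so values are bytes

for target_dir in "${storage_candidates[@]}"; do
    # Resolve the candidate to its mount point, then test free space there.
    mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/ {print $6}')
    if (( ${avails[$mount]:-0} >= requested_size )); then
        printf '* Found test storage at %s\n' "$target_dir"
        export SPDK_TEST_STORAGE=$target_dir
        break
    fi
done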
common/autotest_common.sh@27 -- # exec 00:08:55.529 17:52:48 -- common/autotest_common.sh@29 -- # exec 00:08:55.529 17:52:48 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:55.529 17:52:48 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:55.529 17:52:48 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:55.529 17:52:48 -- common/autotest_common.sh@18 -- # set -x 00:08:55.529 17:52:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:55.529 17:52:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:55.529 17:52:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:55.529 17:52:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:55.529 17:52:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:55.529 17:52:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:55.529 17:52:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:55.529 17:52:48 -- scripts/common.sh@335 -- # IFS=.-: 00:08:55.529 17:52:48 -- scripts/common.sh@335 -- # read -ra ver1 00:08:55.529 17:52:48 -- scripts/common.sh@336 -- # IFS=.-: 00:08:55.529 17:52:48 -- scripts/common.sh@336 -- # read -ra ver2 00:08:55.530 17:52:48 -- scripts/common.sh@337 -- # local 'op=<' 00:08:55.530 17:52:48 -- scripts/common.sh@339 -- # ver1_l=2 00:08:55.530 17:52:48 -- scripts/common.sh@340 -- # ver2_l=1 00:08:55.530 17:52:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:55.530 17:52:48 -- scripts/common.sh@343 -- # case "$op" in 00:08:55.530 17:52:48 -- scripts/common.sh@344 -- # : 1 00:08:55.530 17:52:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:55.530 17:52:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:55.530 17:52:48 -- scripts/common.sh@364 -- # decimal 1 00:08:55.530 17:52:48 -- scripts/common.sh@352 -- # local d=1 00:08:55.530 17:52:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:55.530 17:52:48 -- scripts/common.sh@354 -- # echo 1 00:08:55.530 17:52:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:55.530 17:52:48 -- scripts/common.sh@365 -- # decimal 2 00:08:55.530 17:52:48 -- scripts/common.sh@352 -- # local d=2 00:08:55.530 17:52:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:55.530 17:52:48 -- scripts/common.sh@354 -- # echo 2 00:08:55.530 17:52:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:55.530 17:52:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:55.530 17:52:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:55.530 17:52:48 -- scripts/common.sh@367 -- # return 0 00:08:55.530 17:52:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:55.530 17:52:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:55.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.530 --rc genhtml_branch_coverage=1 00:08:55.530 --rc genhtml_function_coverage=1 00:08:55.530 --rc genhtml_legend=1 00:08:55.530 --rc geninfo_all_blocks=1 00:08:55.530 --rc geninfo_unexecuted_blocks=1 00:08:55.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.530 ' 00:08:55.530 17:52:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:55.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.530 --rc genhtml_branch_coverage=1 00:08:55.530 --rc genhtml_function_coverage=1 00:08:55.530 --rc genhtml_legend=1 00:08:55.530 --rc geninfo_all_blocks=1 00:08:55.530 --rc geninfo_unexecuted_blocks=1 
00:08:55.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.530 ' 00:08:55.530 17:52:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:55.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.530 --rc genhtml_branch_coverage=1 00:08:55.530 --rc genhtml_function_coverage=1 00:08:55.530 --rc genhtml_legend=1 00:08:55.530 --rc geninfo_all_blocks=1 00:08:55.530 --rc geninfo_unexecuted_blocks=1 00:08:55.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.530 ' 00:08:55.530 17:52:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:55.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.530 --rc genhtml_branch_coverage=1 00:08:55.530 --rc genhtml_function_coverage=1 00:08:55.530 --rc genhtml_legend=1 00:08:55.530 --rc geninfo_all_blocks=1 00:08:55.530 --rc geninfo_unexecuted_blocks=1 00:08:55.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.530 ' 00:08:55.530 17:52:48 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:55.530 17:52:48 -- ../common.sh@8 -- # pids=() 00:08:55.530 17:52:48 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:55.530 17:52:48 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:55.530 17:52:48 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:55.530 17:52:48 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:55.530 17:52:48 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:55.530 17:52:48 -- vfio/run.sh@65 -- # mem_size=0 00:08:55.530 17:52:48 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:55.530 17:52:48 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:55.530 17:52:48 -- ../common.sh@69 -- # local fuzz_num=7 00:08:55.530 17:52:48 -- ../common.sh@70 -- # local time=1 00:08:55.530 17:52:48 -- ../common.sh@72 -- # (( i = 0 )) 00:08:55.530 17:52:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.530 17:52:48 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:55.530 17:52:48 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:55.530 17:52:48 -- vfio/run.sh@23 -- # local timen=1 00:08:55.530 17:52:48 -- vfio/run.sh@24 -- # local core=0x1 00:08:55.530 17:52:48 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.530 17:52:48 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:55.530 17:52:48 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:55.789 17:52:48 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:55.789 17:52:48 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:55.789 17:52:48 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.789 17:52:48 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:55.789 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.789 17:52:48 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:55.789 [2024-11-19 17:52:48.409247] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:55.789 [2024-11-19 17:52:48.409302] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645410 ] 00:08:55.789 EAL: No free 2048 kB hugepages reported on node 1 00:08:55.789 [2024-11-19 17:52:48.476951] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.789 [2024-11-19 17:52:48.512924] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:55.789 [2024-11-19 17:52:48.513090] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.048 INFO: Running with entropic power schedule (0xFF, 100). 00:08:56.048 INFO: Seed: 678803484 00:08:56.049 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:56.049 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:56.049 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:56.049 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.049 #2 INITED exec/s: 0 rss: 59Mb 00:08:56.049 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:56.049 This may also happen if the target rejected all inputs we tried so far 00:08:56.309 NEW_FUNC[1/631]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:56.309 NEW_FUNC[2/631]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.309 #10 NEW cov: 10762 ft: 10419 corp: 2/36b lim: 60 exec/s: 0 rss: 66Mb L: 35/35 MS: 3 InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:56.568 #11 NEW cov: 10779 ft: 13595 corp: 3/72b lim: 60 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 InsertByte- 00:08:56.568 #17 NEW cov: 10779 ft: 14227 corp: 4/108b lim: 60 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 CopyPart- 00:08:56.828 #18 NEW cov: 10779 ft: 15361 corp: 5/121b lim: 60 exec/s: 0 rss: 68Mb L: 13/36 MS: 1 InsertRepeatedBytes- 00:08:56.828 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:56.828 #19 NEW cov: 10796 ft: 15835 corp: 6/157b lim: 60 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 ChangeBinInt- 00:08:57.087 #20 NEW cov: 10796 ft: 15883 corp: 7/170b lim: 60 exec/s: 20 rss: 68Mb L: 13/36 MS: 1 ChangeByte- 00:08:57.087 #21 NEW cov: 10796 ft: 16014 corp: 8/206b lim: 60 exec/s: 21 rss: 68Mb L: 36/36 MS: 1 InsertByte- 00:08:57.347 #22 NEW cov: 10796 ft: 16622 corp: 9/242b lim: 60 exec/s: 22 rss: 68Mb L: 36/36 MS: 1 ChangeBinInt- 00:08:57.347 #23 NEW cov: 10796 ft: 16911 corp: 10/282b lim: 60 exec/s: 23 rss: 68Mb L: 40/40 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:57.605 #24 NEW cov: 10796 ft: 17103 corp: 11/322b lim: 60 exec/s: 24 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:08:57.605 #25 NEW cov: 10796 ft: 17147 corp: 12/358b lim: 60 exec/s: 25 rss: 69Mb L: 36/40 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:57.605 #26 NEW cov: 10796 ft: 17260 corp: 13/397b lim: 60 exec/s: 26 rss: 69Mb L: 39/40 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:57.865 #27 NEW cov: 10803 ft: 17495 corp: 14/433b lim: 60 exec/s: 27 rss: 69Mb L: 36/40 MS: 1 ChangeBit- 00:08:57.865 #28 NEW cov: 10803 ft: 17707 corp: 15/469b lim: 60 exec/s: 28 rss: 69Mb L: 36/40 MS: 1 CrossOver- 00:08:58.125 #29 NEW cov: 10803 ft: 17791 corp: 16/509b lim: 60 exec/s: 14 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:08:58.125 #29 DONE cov: 10803 ft: 17791 corp: 16/509b lim: 60 exec/s: 14 rss: 69Mb 00:08:58.125 ###### Recommended dictionary. ###### 00:08:58.125 "\001\000\000\000" # Uses: 2 00:08:58.125 ###### End of recommended dictionary. 
###### 00:08:58.125 Done 29 runs in 2 second(s) 00:08:58.384 17:52:51 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:58.384 17:52:51 -- ../common.sh@72 -- # (( i++ )) 00:08:58.384 17:52:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.384 17:52:51 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:58.384 17:52:51 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:58.384 17:52:51 -- vfio/run.sh@23 -- # local timen=1 00:08:58.384 17:52:51 -- vfio/run.sh@24 -- # local core=0x1 00:08:58.384 17:52:51 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.384 17:52:51 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:58.384 17:52:51 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:58.384 17:52:51 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:58.384 17:52:51 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:58.384 17:52:51 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.384 17:52:51 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:58.384 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.385 17:52:51 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:58.385 [2024-11-19 17:52:51.091174] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:58.385 [2024-11-19 17:52:51.091243] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645789 ] 00:08:58.385 EAL: No free 2048 kB hugepages reported on node 1 00:08:58.385 [2024-11-19 17:52:51.162279] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.385 [2024-11-19 17:52:51.198774] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:58.385 [2024-11-19 17:52:51.198937] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.644 INFO: Running with entropic power schedule (0xFF, 100). 00:08:58.644 INFO: Seed: 3363791832 00:08:58.644 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:58.644 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:58.644 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.644 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.644 #2 INITED exec/s: 0 rss: 60Mb 00:08:58.644 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:58.645 This may also happen if the target rejected all inputs we tried so far 00:08:58.645 [2024-11-19 17:52:51.492631] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.645 [2024-11-19 17:52:51.492663] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.645 [2024-11-19 17:52:51.492681] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.164 NEW_FUNC[1/634]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:59.164 NEW_FUNC[2/634]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:59.164 #3 NEW cov: 10767 ft: 10702 corp: 2/9b lim: 40 exec/s: 0 rss: 65Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:59.164 [2024-11-19 17:52:51.939566] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.164 [2024-11-19 17:52:51.939605] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.164 [2024-11-19 17:52:51.939625] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.423 #7 NEW cov: 10781 ft: 14130 corp: 3/19b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 4 InsertByte-EraseBytes-CopyPart-InsertRepeatedBytes- 00:08:59.423 [2024-11-19 17:52:52.120809] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.424 [2024-11-19 17:52:52.120832] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.424 [2024-11-19 17:52:52.120849] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.424 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:59.424 #8 NEW cov: 10798 ft: 15808 corp: 4/27b lim: 40 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:59.683 [2024-11-19 17:52:52.291834] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.683 [2024-11-19 17:52:52.291857] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.683 [2024-11-19 17:52:52.291874] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.683 #9 NEW cov: 10798 ft: 16272 corp: 5/36b lim: 40 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 EraseBytes- 00:08:59.683 [2024-11-19 17:52:52.466144] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.683 [2024-11-19 17:52:52.466166] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.683 [2024-11-19 17:52:52.466184] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.942 #18 NEW cov: 10798 ft: 16538 corp: 6/47b lim: 40 exec/s: 18 rss: 70Mb L: 11/11 MS: 4 ChangeBit-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:59.942 [2024-11-19 17:52:52.646635] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.942 [2024-11-19 17:52:52.646657] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.942 [2024-11-19 17:52:52.646675] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.942 #27 NEW cov: 10798 ft: 16660 corp: 7/66b lim: 40 exec/s: 27 rss: 70Mb L: 19/19 MS: 4 
ChangeBinInt-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:09:00.201 [2024-11-19 17:52:52.818237] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.201 [2024-11-19 17:52:52.818260] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.201 [2024-11-19 17:52:52.818278] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.201 #28 NEW cov: 10798 ft: 16769 corp: 8/76b lim: 40 exec/s: 28 rss: 70Mb L: 10/19 MS: 1 ShuffleBytes- 00:09:00.201 [2024-11-19 17:52:52.989668] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.201 [2024-11-19 17:52:52.989690] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.201 [2024-11-19 17:52:52.989726] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.460 #29 NEW cov: 10798 ft: 16870 corp: 9/95b lim: 40 exec/s: 29 rss: 70Mb L: 19/19 MS: 1 ChangeBinInt- 00:09:00.460 [2024-11-19 17:52:53.158859] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.460 [2024-11-19 17:52:53.158882] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.460 [2024-11-19 17:52:53.158899] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.460 #30 NEW cov: 10805 ft: 17018 corp: 10/103b lim: 40 exec/s: 30 rss: 70Mb L: 8/19 MS: 1 ShuffleBytes- 00:09:00.720 [2024-11-19 17:52:53.329159] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.720 [2024-11-19 17:52:53.329182] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.720 [2024-11-19 17:52:53.329200] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.720 #33 NEW cov: 10805 ft: 17159 corp: 11/124b lim: 40 exec/s: 16 rss: 70Mb L: 21/21 MS: 3 CopyPart-CopyPart-CrossOver- 00:09:00.720 #33 DONE cov: 10805 ft: 17159 corp: 11/124b lim: 40 exec/s: 16 rss: 70Mb 00:09:00.720 Done 33 runs in 2 second(s) 00:09:00.980 17:52:53 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:09:00.980 17:52:53 -- ../common.sh@72 -- # (( i++ )) 00:09:00.980 17:52:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.980 17:52:53 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:00.980 17:52:53 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:00.980 17:52:53 -- vfio/run.sh@23 -- # local timen=1 00:09:00.980 17:52:53 -- vfio/run.sh@24 -- # local core=0x1 00:09:00.980 17:52:53 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:00.980 17:52:53 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:00.980 17:52:53 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:00.980 17:52:53 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:00.980 17:52:53 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:00.980 17:52:53 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:00.980 17:52:53 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:00.980 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:00.980 17:52:53 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:00.980 [2024-11-19 17:52:53.729079] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:09:00.980 [2024-11-19 17:52:53.729173] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646333 ] 00:09:00.980 EAL: No free 2048 kB hugepages reported on node 1 00:09:00.980 [2024-11-19 17:52:53.799760] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.980 [2024-11-19 17:52:53.835323] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:00.980 [2024-11-19 17:52:53.835467] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.239 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.239 INFO: Seed: 1705817128 00:09:01.239 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:01.239 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:01.239 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:01.239 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.239 #2 INITED exec/s: 0 rss: 60Mb 00:09:01.239 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:01.240 This may also happen if the target rejected all inputs we tried so far 00:09:01.499 [2024-11-19 17:52:54.119616] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:01.499 [2024-11-19 17:52:54.119663] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:01.759 NEW_FUNC[1/634]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:09:01.759 NEW_FUNC[2/634]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:01.759 #11 NEW cov: 10755 ft: 10556 corp: 2/27b lim: 80 exec/s: 0 rss: 65Mb L: 26/26 MS: 4 ChangeByte-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:09:01.759 [2024-11-19 17:52:54.562042] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:01.759 [2024-11-19 17:52:54.562083] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:02.018 #17 NEW cov: 10774 ft: 14622 corp: 3/47b lim: 80 exec/s: 0 rss: 67Mb L: 20/26 MS: 1 EraseBytes- 00:09:02.018 [2024-11-19 17:52:54.734489] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.018 [2024-11-19 17:52:54.734518] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:02.018 #18 NEW cov: 10777 ft: 15337 corp: 4/84b lim: 80 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:09:02.278 [2024-11-19 17:52:54.905117] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.278 [2024-11-19 17:52:54.905145] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:02.278 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:02.278 #19 NEW cov: 10794 ft: 15762 corp: 5/120b lim: 80 exec/s: 0 rss: 68Mb L: 36/37 MS: 1 CopyPart- 00:09:02.278 [2024-11-19 17:52:55.077164] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.278 [2024-11-19 17:52:55.077194] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:02.538 #20 NEW cov: 10794 ft: 16323 corp: 6/146b lim: 80 exec/s: 20 rss: 68Mb L: 26/37 MS: 1 ShuffleBytes- 00:09:02.538 [2024-11-19 17:52:55.248501] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.538 [2024-11-19 17:52:55.248530] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:02.538 #21 NEW cov: 10794 ft: 16595 corp: 7/224b lim: 80 exec/s: 21 rss: 68Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:09:02.798 [2024-11-19 17:52:55.419929] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.798 [2024-11-19 17:52:55.419959] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:02.798 #23 NEW cov: 10794 ft: 17417 corp: 8/254b lim: 80 exec/s: 23 rss: 68Mb L: 30/78 MS: 2 ShuffleBytes-CrossOver- 00:09:02.798 [2024-11-19 17:52:55.602107] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.798 [2024-11-19 17:52:55.602135] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:03.057 #24 NEW cov: 10794 ft: 17533 corp: 9/313b lim: 80 exec/s: 24 rss: 68Mb L: 59/78 MS: 1 EraseBytes- 00:09:03.057 [2024-11-19 17:52:55.774492] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:03.057 [2024-11-19 17:52:55.774522] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:03.057 #25 NEW cov: 10801 ft: 17571 corp: 10/333b lim: 80 exec/s: 25 rss: 68Mb L: 20/78 MS: 1 CrossOver- 00:09:03.317 [2024-11-19 17:52:55.946372] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:03.317 [2024-11-19 17:52:55.946402] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:03.317 #26 NEW cov: 10801 ft: 17635 corp: 11/370b lim: 80 exec/s: 13 rss: 68Mb L: 37/78 MS: 1 CopyPart- 00:09:03.317 #26 DONE cov: 10801 ft: 17635 corp: 11/370b lim: 80 exec/s: 13 rss: 68Mb 00:09:03.317 Done 26 runs in 2 second(s) 00:09:03.577 17:52:56 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:09:03.577 17:52:56 -- ../common.sh@72 -- # (( i++ )) 00:09:03.577 17:52:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:03.577 17:52:56 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:03.577 17:52:56 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:03.577 17:52:56 -- vfio/run.sh@23 -- # local timen=1 00:09:03.577 17:52:56 -- vfio/run.sh@24 -- # local core=0x1 00:09:03.577 17:52:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:03.577 17:52:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:03.577 17:52:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:03.577 17:52:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:03.577 17:52:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:03.577 17:52:56 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:03.577 17:52:56 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:03.577 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:03.577 17:52:56 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:03.577 [2024-11-19 17:52:56.350044] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:09:03.577 [2024-11-19 17:52:56.350114] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646876 ] 00:09:03.577 EAL: No free 2048 kB hugepages reported on node 1 00:09:03.577 [2024-11-19 17:52:56.420663] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.837 [2024-11-19 17:52:56.456593] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:03.837 [2024-11-19 17:52:56.456767] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.837 INFO: Running with entropic power schedule (0xFF, 100). 
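Each "#N NEW cov: ... ft: ... corp: ..." entry in the runs above is a libFuzzer status line: cov counts covered code edges, ft counts features, corp gives corpus size as entries/bytes, lim is the current input-length cap, L the size of the new input, exec/s the execution rate, rss resident memory, and MS: the mutation sequence that produced the input. A small sketch for pulling the closing figures out of a captured log like this one (the log filename argument is hypothetical):

#!/usr/bin/env bash
# Print the final edge coverage and exec/s that each fuzzer run reported.
awk '
    # Status lines look like: "#26 DONE cov: 10801 ft: 17635 corp: 11/370b ..."
    /#[0-9]+ (NEW|INITED|DONE) / {
        for (i = 1; i <= NF; i++) {
            if ($i == "cov:")    cov  = $(i + 1)
            if ($i == "exec/s:") rate = $(i + 1)
        }
    }
    / DONE / { printf "final cov: %s  exec/s: %s\n", cov, rate }
' "${1:-fuzz.log}"

Against the first run above this would print "final cov: 10803  exec/s: 14", the same numbers as its "#29 DONE" line.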
00:09:03.837 INFO: Seed: 28849926
00:09:03.837 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d),
00:09:03.837 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230),
00:09:03.837 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:09:03.837 INFO: A corpus is not provided, starting from an empty corpus
00:09:03.837 #2 INITED exec/s: 0 rss: 60Mb
00:09:03.837 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:03.837 This may also happen if the target rejected all inputs we tried so far
00:09:04.097 [2024-11-19 17:52:56.712643] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument
00:09:04.097 [2024-11-19 17:52:56.712677] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:04.097 [2024-11-19 17:52:56.712688] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.097 [2024-11-19 17:52:56.712720] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:04.356 NEW_FUNC[1/638]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125
00:09:04.356 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:04.356 #10 NEW cov: 10777 ft: 10676 corp: 2/114b lim: 320 exec/s: 0 rss: 65Mb L: 113/113 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes-
00:09:04.356 [2024-11-19 17:52:57.114632] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:04.356 [2024-11-19 17:52:57.114666] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:04.356 [2024-11-19 17:52:57.114682] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.356 [2024-11-19 17:52:57.114700] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:04.356 #16 NEW cov: 10794 ft: 13268 corp: 3/227b lim: 320 exec/s: 0 rss: 67Mb L: 113/113 MS: 1 CrossOver-
00:09:04.616 [2024-11-19 17:52:57.229500] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.229527] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.229537] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.229570] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:04.616 #17 NEW cov: 10794 ft: 14534 corp: 4/341b lim: 320 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 InsertByte-
00:09:04.616 [2024-11-19 17:52:57.344497] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.344523] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.344535] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.344569] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:04.616 #18 NEW cov: 10794 ft: 14743 corp: 5/455b lim: 320 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ChangeByte-
00:09:04.616 [2024-11-19 17:52:57.460421] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0xdc00000000 prot=0x3: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.460448] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0xdc00000000 flags=0x3: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.460458] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.616 [2024-11-19 17:52:57.460476] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:04.876 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:04.876 #19 NEW cov: 10811 ft: 15567 corp: 6/570b lim: 320 exec/s: 0 rss: 68Mb L: 115/115 MS: 1 InsertByte-
00:09:04.876 [2024-11-19 17:52:57.575393] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:04.876 [2024-11-19 17:52:57.575419] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:04.876 [2024-11-19 17:52:57.575429] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.876 [2024-11-19 17:52:57.575449] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:04.876 #20 NEW cov: 10811 ft: 15683 corp: 7/684b lim: 320 exec/s: 0 rss: 68Mb L: 114/115 MS: 1 ChangeBinInt-
00:09:04.876 [2024-11-19 17:52:57.690202] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0xdc00000000 prot=0x3: Invalid argument
00:09:04.876 [2024-11-19 17:52:57.690228] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0xdc00000000 flags=0x3: Invalid argument
00:09:04.876 [2024-11-19 17:52:57.690239] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:04.876 [2024-11-19 17:52:57.690257] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.136 #31 NEW cov: 10811 ft: 15724 corp: 8/922b lim: 320 exec/s: 31 rss: 68Mb L: 238/238 MS: 1 InsertRepeatedBytes-
00:09:05.136 [2024-11-19 17:52:57.805144] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:05.136 [2024-11-19 17:52:57.805174] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:05.136 [2024-11-19 17:52:57.805185] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.136 [2024-11-19 17:52:57.805217] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.136 #32 NEW cov: 10811 ft: 15749 corp: 9/1035b lim: 320 exec/s: 32 rss: 68Mb L: 113/238 MS: 1 CopyPart-
00:09:05.136 [2024-11-19 17:52:57.919968] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:05.136 [2024-11-19 17:52:57.919993] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:05.136 [2024-11-19 17:52:57.920003] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.136 [2024-11-19 17:52:57.920021] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.136 #33 NEW cov: 10811 ft: 16058 corp: 10/1148b lim: 320 exec/s: 33 rss: 68Mb L: 113/238 MS: 1 ChangeBinInt-
00:09:05.396 [2024-11-19 17:52:58.034887] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:05.396 [2024-11-19 17:52:58.034911] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:05.396 [2024-11-19 17:52:58.034922] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.396 [2024-11-19 17:52:58.034955] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.396 #34 NEW cov: 10811 ft: 16107 corp: 11/1268b lim: 320 exec/s: 34 rss: 68Mb L: 120/238 MS: 1 CrossOver-
00:09:05.396 [2024-11-19 17:52:58.149905] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:05.396 [2024-11-19 17:52:58.149929] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:05.396 [2024-11-19 17:52:58.149939] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.396 [2024-11-19 17:52:58.149972] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.396 #35 NEW cov: 10811 ft: 16528 corp: 12/1381b lim: 320 exec/s: 35 rss: 68Mb L: 113/238 MS: 1 ChangeBit-
00:09:05.655 [2024-11-19 17:52:58.264695] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.264719] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.264730] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.264748] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.655 #37 NEW cov: 10811 ft: 16800 corp: 13/1502b lim: 320 exec/s: 37 rss: 69Mb L: 121/238 MS: 2 ChangeBit-CrossOver-
00:09:05.655 [2024-11-19 17:52:58.389492] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0xdc00000000 prot=0x3: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.389515] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0xdc00000000 flags=0x3: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.389526] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.389558] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.655 #38 NEW cov: 10811 ft: 17288 corp: 14/1688b lim: 320 exec/s: 38 rss: 69Mb L: 186/238 MS: 1 InsertRepeatedBytes-
00:09:05.655 [2024-11-19 17:52:58.504375] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.504401] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.504414] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.655 [2024-11-19 17:52:58.504449] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.915 #39 NEW cov: 10811 ft: 17370 corp: 15/1838b lim: 320 exec/s: 39 rss: 69Mb L: 150/238 MS: 1 CopyPart-
00:09:05.915 [2024-11-19 17:52:58.618353] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0xdc00000000 prot=0x3: Invalid argument
00:09:05.915 [2024-11-19 17:52:58.618378] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0xdc00000000 flags=0x3: Invalid argument
00:09:05.915 [2024-11-19 17:52:58.618388] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument
00:09:05.915 [2024-11-19 17:52:58.618421] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:05.915 #40 NEW cov: 10811 ft: 17446 corp: 16/2024b lim: 320 exec/s: 20 rss: 69Mb L: 186/238 MS: 1 ShuffleBytes-
00:09:05.915 #40 DONE cov: 10811 ft: 17446 corp: 16/2024b lim: 320 exec/s: 20 rss: 69Mb
00:09:05.915 Done 40 runs in 2 second(s)
00:09:06.175 17:52:58 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3
00:09:06.175 17:52:58 -- ../common.sh@72 -- # (( i++ ))
00:09:06.175 17:52:58 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:06.175 17:52:58 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1
00:09:06.175 17:52:58 -- vfio/run.sh@22 -- # local fuzzer_type=4
00:09:06.175 17:52:58 -- vfio/run.sh@23 -- # local timen=1
00:09:06.175 17:52:58 -- vfio/run.sh@24 -- # local core=0x1
00:09:06.175 17:52:58 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:09:06.175 17:52:58 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4
00:09:06.175 17:52:58 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1
00:09:06.175 17:52:58 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2
00:09:06.175 17:52:58 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf
00:09:06.175 17:52:58 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:09:06.175 17:52:58 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%;
00:09:06.175 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:06.175 17:52:58 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4
00:09:06.175 [2024-11-19 17:52:58.988494] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:09:06.175 [2024-11-19 17:52:58.988563] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647203 ]
00:09:06.175 EAL: No free 2048 kB hugepages reported on node 1
00:09:06.434 [2024-11-19 17:52:59.059070] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:06.434 [2024-11-19 17:52:59.095206] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:06.434 [2024-11-19 17:52:59.095354] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:06.434 INFO: Running with entropic power schedule (0xFF, 100).
00:09:06.434 INFO: Seed: 2674218532
00:09:06.434 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d),
00:09:06.434 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230),
00:09:06.434 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:09:06.434 INFO: A corpus is not provided, starting from an empty corpus
00:09:06.434 #2 INITED exec/s: 0 rss: 60Mb
00:09:06.434 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:06.434 This may also happen if the target rejected all inputs we tried so far
00:09:06.952 NEW_FUNC[1/629]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145
00:09:06.952 NEW_FUNC[2/629]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:06.952 #22 NEW cov: 10727 ft: 10706 corp: 2/52b lim: 320 exec/s: 0 rss: 66Mb L: 51/51 MS: 5 ChangeByte-CopyPart-InsertByte-ChangeBinInt-InsertRepeatedBytes-
00:09:07.211 NEW_FUNC[1/3]: 0x168eb18 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593
00:09:07.211 NEW_FUNC[2/3]: 0x1690048 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757
00:09:07.211 #23 NEW cov: 10764 ft: 13884 corp: 3/84b lim: 320 exec/s: 0 rss: 67Mb L: 32/51 MS: 1 EraseBytes-
00:09:07.211 #29 NEW cov: 10764 ft: 14433 corp: 4/135b lim: 320 exec/s: 0 rss: 68Mb L: 51/51 MS: 1 ShuffleBytes-
00:09:07.470 #31 NEW cov: 10764 ft: 15195 corp: 5/182b lim: 320 exec/s: 0 rss: 68Mb L: 47/51 MS: 2 InsertByte-InsertRepeatedBytes-
00:09:07.470 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:07.471 #47 NEW cov: 10781 ft: 15703 corp: 6/259b lim: 320 exec/s: 0 rss: 68Mb L: 77/77 MS: 1 InsertRepeatedBytes-
00:09:07.471 #48 NEW cov: 10781 ft: 15755 corp: 7/292b lim: 320 exec/s: 48 rss: 68Mb L: 33/77 MS: 1 InsertByte-
00:09:07.730 #49 NEW cov: 10781 ft: 16015 corp: 8/324b lim: 320 exec/s: 49 rss: 68Mb L: 32/77 MS: 1 ChangeBit-
00:09:07.730 #50 NEW cov: 10781 ft: 16196 corp: 9/478b lim: 320 exec/s: 50 rss: 68Mb L: 154/154 MS: 1 CrossOver-
00:09:07.989 #51 NEW cov: 10781 ft: 16236 corp: 10/530b lim: 320 exec/s: 51 rss: 68Mb L: 52/154 MS: 1 InsertByte-
00:09:07.989 #52 NEW cov: 10781 ft: 16498 corp: 11/562b lim: 320 exec/s: 52 rss: 68Mb L: 32/154 MS: 1 ShuffleBytes-
00:09:08.248 #53 NEW cov: 10781 ft: 16919 corp: 12/594b lim: 320 exec/s: 53 rss: 69Mb L: 32/154 MS: 1 ChangeBit-
00:09:08.248 #54 NEW cov: 10781 ft: 17026 corp: 13/641b lim: 320 exec/s: 54 rss: 69Mb L: 47/154 MS: 1 EraseBytes-
00:09:08.248 #55 NEW cov: 10781 ft: 17053 corp: 14/714b lim: 320 exec/s: 55 rss: 69Mb L: 73/154 MS: 1 InsertRepeatedBytes-
00:09:08.508 #56 NEW cov: 10788 ft: 17074 corp: 15/765b lim: 320 exec/s: 56 rss: 69Mb L: 51/154 MS: 1 ChangeBit-
00:09:08.508 [2024-11-19 17:53:01.261413] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 16927600444109941482 > max 8796093022208
00:09:08.508 [2024-11-19 17:53:01.261452] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xeaeaeaeaeaeaeaea, 0xd5d5d5d5d5d5d5d4) offset=0xeaeaeaeaeaeaeaea flags=0x3: No space left on device
00:09:08.508 [2024-11-19 17:53:01.261464] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device
00:09:08.508 [2024-11-19 17:53:01.261500] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure
00:09:08.508 [2024-11-19 17:53:01.262419] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xeaeaeaeaeaeaeaea, 0xd5d5d5d5d5d5d5d4) flags=0: No such file or directory
00:09:08.508 [2024-11-19 17:53:01.262443] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory
00:09:08.508 [2024-11-19 17:53:01.262461] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure
00:09:08.508 NEW_FUNC[1/6]: 0x1330b08 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638
00:09:08.508 NEW_FUNC[2/6]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084
00:09:08.508 #57 NEW cov: 10822 ft: 17507 corp: 16/816b lim: 320 exec/s: 28 rss: 69Mb L: 51/154 MS: 1 ChangeBinInt-
00:09:08.508 #57 DONE cov: 10822 ft: 17507 corp: 16/816b lim: 320 exec/s: 28 rss: 69Mb
00:09:08.508 Done 57 runs in 2 second(s)
00:09:08.768 17:53:01 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4
00:09:08.768 17:53:01 -- ../common.sh@72 -- # (( i++ ))
00:09:08.768 17:53:01 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:08.768 17:53:01 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
00:09:08.768 17:53:01 -- vfio/run.sh@22 -- # local fuzzer_type=5
00:09:08.768 17:53:01 -- vfio/run.sh@23 -- # local timen=1
00:09:08.768 17:53:01 -- vfio/run.sh@24 -- # local core=0x1
00:09:08.768 17:53:01 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:09:08.768 17:53:01 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5
00:09:08.768 17:53:01 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1
00:09:08.768 17:53:01 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2
00:09:08.768 17:53:01 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf
00:09:08.768 17:53:01 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:09:08.768 17:53:01 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%;
00:09:08.768 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:08.768 17:53:01 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5
00:09:09.027 [2024-11-19 17:53:01.636729] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:09:09.027 [2024-11-19 17:53:01.636820] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647730 ]
00:09:09.027 EAL: No free 2048 kB hugepages reported on node 1
00:09:09.027 [2024-11-19 17:53:01.707065] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:09.027 [2024-11-19 17:53:01.742942] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:09.027 [2024-11-19 17:53:01.743104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:09.287 INFO: Running with entropic power schedule (0xFF, 100).
00:09:09.287 INFO: Seed: 1019915949
00:09:09.287 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d),
00:09:09.287 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230),
00:09:09.287 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:09:09.287 INFO: A corpus is not provided, starting from an empty corpus
00:09:09.287 #2 INITED exec/s: 0 rss: 60Mb
00:09:09.287 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:09.287 This may also happen if the target rejected all inputs we tried so far
00:09:09.287 [2024-11-19 17:53:02.029633] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.287 [2024-11-19 17:53:02.029678] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:09.805 NEW_FUNC[1/638]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172
00:09:09.805 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:09.805 #13 NEW cov: 10782 ft: 10742 corp: 2/119b lim: 120 exec/s: 0 rss: 65Mb L: 118/118 MS: 1 InsertRepeatedBytes-
00:09:09.805 [2024-11-19 17:53:02.496392] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.805 [2024-11-19 17:53:02.496433] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.064 #14 NEW cov: 10798 ft: 13733 corp: 3/237b lim: 120 exec/s: 0 rss: 66Mb L: 118/118 MS: 1 ChangeBit-
00:09:10.064 [2024-11-19 17:53:02.675099] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.064 [2024-11-19 17:53:02.675134] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.064 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:10.064 #15 NEW cov: 10815 ft: 14956 corp: 4/355b lim: 120 exec/s: 0 rss: 67Mb L: 118/118 MS: 1 ChangeBit-
00:09:10.064 [2024-11-19 17:53:02.856339] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.064 [2024-11-19 17:53:02.856369] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.323 #16 NEW cov: 10815 ft: 15853 corp: 5/473b lim: 120 exec/s: 16 rss: 67Mb L: 118/118 MS: 1 ShuffleBytes-
00:09:10.323 [2024-11-19 17:53:03.039819] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.323 [2024-11-19 17:53:03.039849] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.323 #17 NEW cov: 10815 ft: 16266 corp: 6/591b lim: 120 exec/s: 17 rss: 67Mb L: 118/118 MS: 1 ChangeBit-
00:09:10.582 [2024-11-19 17:53:03.223547] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.582 [2024-11-19 17:53:03.223577] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.582 #18 NEW cov: 10815 ft: 16750 corp: 7/605b lim: 120 exec/s: 18 rss: 67Mb L: 14/118 MS: 1 CrossOver-
00:09:10.582 [2024-11-19 17:53:03.408281] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.582 [2024-11-19 17:53:03.408311] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.840 #19 NEW cov: 10815 ft: 16945 corp: 8/723b lim: 120 exec/s: 19 rss: 67Mb L: 118/118 MS: 1 ChangeBit-
00:09:10.840 [2024-11-19 17:53:03.590592] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.840 [2024-11-19 17:53:03.590630] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.840 #20 NEW cov: 10815 ft: 17062 corp: 9/843b lim: 120 exec/s: 20 rss: 67Mb L: 120/120 MS: 1 CopyPart-
00:09:11.099 [2024-11-19 17:53:03.773419] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:11.099 [2024-11-19 17:53:03.773450] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:11.099 #21 NEW cov: 10822 ft: 17329 corp: 10/953b lim: 120 exec/s: 21 rss: 67Mb L: 110/120 MS: 1 InsertRepeatedBytes-
00:09:11.099 [2024-11-19 17:53:03.957400] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:11.099 [2024-11-19 17:53:03.957431] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:11.359 #22 NEW cov: 10822 ft: 17418 corp: 11/1073b lim: 120 exec/s: 11 rss: 68Mb L: 120/120 MS: 1 CrossOver-
00:09:11.359 #22 DONE cov: 10822 ft: 17418 corp: 11/1073b lim: 120 exec/s: 11 rss: 68Mb
00:09:11.359 Done 22 runs in 2 second(s)
00:09:11.618 17:53:04 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5
00:09:11.618 17:53:04 -- ../common.sh@72 -- # (( i++ ))
00:09:11.618 17:53:04 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:11.618 17:53:04 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:09:11.618 17:53:04 -- vfio/run.sh@22 -- # local fuzzer_type=6
00:09:11.618 17:53:04 -- vfio/run.sh@23 -- # local timen=1
00:09:11.618 17:53:04 -- vfio/run.sh@24 -- # local core=0x1
00:09:11.618 17:53:04 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:09:11.618 17:53:04 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
00:09:11.618 17:53:04 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
00:09:11.618 17:53:04 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
00:09:11.618 17:53:04 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
00:09:11.618 17:53:04 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:09:11.618 17:53:04 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:09:11.618 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:11.618 17:53:04 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
00:09:11.618 [2024-11-19 17:53:04.382220] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:09:11.618 [2024-11-19 17:53:04.382304] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid648273 ]
00:09:11.618 EAL: No free 2048 kB hugepages reported on node 1
00:09:11.878 [2024-11-19 17:53:04.452801] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:11.878 [2024-11-19 17:53:04.489286] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:11.878 [2024-11-19 17:53:04.489435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:11.878 INFO: Running with entropic power schedule (0xFF, 100).
00:09:11.878 INFO: Seed: 3765883437
00:09:11.878 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d),
00:09:11.878 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230),
00:09:11.878 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:09:11.878 INFO: A corpus is not provided, starting from an empty corpus
00:09:11.878 #2 INITED exec/s: 0 rss: 60Mb
00:09:11.878 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:11.878 This may also happen if the target rejected all inputs we tried so far
00:09:12.137 [2024-11-19 17:53:04.776634] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.137 [2024-11-19 17:53:04.776675] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.397 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:09:12.397 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:12.397 #12 NEW cov: 10776 ft: 10712 corp: 2/80b lim: 90 exec/s: 0 rss: 65Mb L: 79/79 MS: 5 ChangeBinInt-CrossOver-ChangeByte-ChangeByte-InsertRepeatedBytes-
00:09:12.397 [2024-11-19 17:53:05.227700] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.397 [2024-11-19 17:53:05.227741] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.657 #13 NEW cov: 10790 ft: 13422 corp: 3/159b lim: 90 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 CopyPart-
00:09:12.657 [2024-11-19 17:53:05.404919] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.657 [2024-11-19 17:53:05.404952] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.657 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:12.657 #19 NEW cov: 10807 ft: 14375 corp: 4/199b lim: 90 exec/s: 0 rss: 68Mb L: 40/79 MS: 1 EraseBytes-
00:09:12.916 [2024-11-19 17:53:05.581552] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.916 [2024-11-19 17:53:05.581584] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.916 #20 NEW cov: 10807 ft: 15743 corp: 5/278b lim: 90 exec/s: 20 rss: 68Mb L: 79/79 MS: 1 CrossOver-
00:09:12.916 [2024-11-19 17:53:05.755758] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.916 [2024-11-19 17:53:05.755788] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.176 #21 NEW cov: 10807 ft: 16003 corp: 6/313b lim: 90 exec/s: 21 rss: 68Mb L: 35/79 MS: 1 CrossOver-
00:09:13.176 [2024-11-19 17:53:05.927905] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.176 [2024-11-19 17:53:05.927935] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.176 #22 NEW cov: 10807 ft: 16489 corp: 7/392b lim: 90 exec/s: 22 rss: 68Mb L: 79/79 MS: 1 ChangeByte-
00:09:13.435 [2024-11-19 17:53:06.099307] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.435 [2024-11-19 17:53:06.099336] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.435 #23 NEW cov: 10807 ft: 16674 corp: 8/472b lim: 90 exec/s: 23 rss: 68Mb L: 80/80 MS: 1 InsertByte-
00:09:13.435 [2024-11-19 17:53:06.270950] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.435 [2024-11-19 17:53:06.270979] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.695 #24 NEW cov: 10807 ft: 16953 corp: 9/557b lim: 90 exec/s: 24 rss: 68Mb L: 85/85 MS: 1 InsertRepeatedBytes-
00:09:13.695 [2024-11-19 17:53:06.443251] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.695 [2024-11-19 17:53:06.443286] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.695 #25 NEW cov: 10814 ft: 16976 corp: 10/643b lim: 90 exec/s: 25 rss: 68Mb L: 86/86 MS: 1 InsertByte-
00:09:13.955 [2024-11-19 17:53:06.616408] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.955 [2024-11-19 17:53:06.616440] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.955 #26 NEW cov: 10814 ft: 17177 corp: 11/723b lim: 90 exec/s: 13 rss: 68Mb L: 80/86 MS: 1 InsertByte-
00:09:13.955 #26 DONE cov: 10814 ft: 17177 corp: 11/723b lim: 90 exec/s: 13 rss: 68Mb
00:09:13.955 Done 26 runs in 2 second(s)
00:09:14.215 17:53:06 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6
00:09:14.215 17:53:06 -- ../common.sh@72 -- # (( i++ ))
00:09:14.215 17:53:06 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:14.215 17:53:06 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT
00:09:14.215
00:09:14.215 real 0m19.025s
00:09:14.215 user 0m26.072s
00:09:14.215 sys 0m1.836s
00:09:14.215 17:53:06 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:09:14.215 17:53:06 -- common/autotest_common.sh@10 -- # set +x
00:09:14.215 ************************************
00:09:14.215 END TEST vfio_fuzz
00:09:14.215 ************************************
00:09:14.215
00:09:14.215 real 1m23.242s
00:09:14.215 user 2m5.434s
00:09:14.215 sys 0m10.482s
00:09:14.215 17:53:07 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:09:14.215 17:53:07 -- common/autotest_common.sh@10 -- # set +x
00:09:14.215 ************************************
00:09:14.215 END TEST llvm_fuzz
00:09:14.215 ************************************
00:09:14.215 17:53:07 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]]
00:09:14.215 17:53:07 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT
00:09:14.215 17:53:07 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup
00:09:14.215 17:53:07 -- common/autotest_common.sh@722 -- # xtrace_disable
00:09:14.215 17:53:07 -- common/autotest_common.sh@10 -- # set +x
00:09:14.215 17:53:07 -- spdk/autotest.sh@373 -- # autotest_cleanup 17:53:07 -- common/autotest_common.sh@1381 -- # local autotest_es=0 17:53:07 -- common/autotest_common.sh@1382 -- # xtrace_disable 17:53:07 -- common/autotest_common.sh@10 -- # set +x
00:09:20.790 INFO: APP EXITING
00:09:20.790 INFO: killing all VMs
00:09:20.790 INFO: killing vhost app
00:09:20.790 INFO: EXIT DONE
00:09:23.329 Waiting for block devices as requested
00:09:23.329 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:23.329 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:23.329 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:23.329 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:23.329 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:23.589 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:23.589 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:23.589 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:23.849 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:23.849 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:23.849 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:23.849 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:24.108 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:24.108 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:24.108 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:24.368 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:24.368 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:28.564 Cleaning
00:09:28.564 Removing: /dev/shm/spdk_tgt_trace.pid610843
00:09:28.564 Removing: /var/run/dpdk/spdk_pid608361
00:09:28.564 Removing: /var/run/dpdk/spdk_pid609633
00:09:28.564 Removing: /var/run/dpdk/spdk_pid610843
00:09:28.564 Removing: /var/run/dpdk/spdk_pid611641
00:09:28.564 Removing: /var/run/dpdk/spdk_pid611971
00:09:28.564 Removing: /var/run/dpdk/spdk_pid612304
00:09:28.564 Removing: /var/run/dpdk/spdk_pid612648
00:09:28.564 Removing: /var/run/dpdk/spdk_pid612990
00:09:28.564 Removing: /var/run/dpdk/spdk_pid613275
00:09:28.564 Removing: /var/run/dpdk/spdk_pid613558
00:09:28.564 Removing: /var/run/dpdk/spdk_pid613875
00:09:28.564 Removing: /var/run/dpdk/spdk_pid614652
00:09:28.564 Removing: /var/run/dpdk/spdk_pid617835
00:09:28.564 Removing: /var/run/dpdk/spdk_pid618215
00:09:28.564 Removing: /var/run/dpdk/spdk_pid618549
00:09:28.564 Removing: /var/run/dpdk/spdk_pid618566
00:09:28.565 Removing: /var/run/dpdk/spdk_pid619136
00:09:28.565 Removing: /var/run/dpdk/spdk_pid619383
00:09:28.565 Removing: /var/run/dpdk/spdk_pid619737
00:09:28.565 Removing: /var/run/dpdk/spdk_pid619987
00:09:28.565 Removing: /var/run/dpdk/spdk_pid620284
00:09:28.565 Removing: /var/run/dpdk/spdk_pid620510
00:09:28.565 Removing: /var/run/dpdk/spdk_pid620602
00:09:28.565 Removing: /var/run/dpdk/spdk_pid620865
00:09:28.565 Removing: /var/run/dpdk/spdk_pid621289
00:09:28.565 Removing: /var/run/dpdk/spdk_pid621532
00:09:28.565 Removing: /var/run/dpdk/spdk_pid621816
00:09:28.565 Removing: /var/run/dpdk/spdk_pid622139
00:09:28.565 Removing: /var/run/dpdk/spdk_pid622349
00:09:28.565 Removing: /var/run/dpdk/spdk_pid622469
00:09:28.565 Removing: /var/run/dpdk/spdk_pid622533
00:09:28.565 Removing: /var/run/dpdk/spdk_pid622799
00:09:28.565 Removing: /var/run/dpdk/spdk_pid623082
00:09:28.565 Removing: /var/run/dpdk/spdk_pid623358
00:09:28.565 Removing: /var/run/dpdk/spdk_pid623542
00:09:28.565 Removing: /var/run/dpdk/spdk_pid623683
00:09:28.565 Removing: /var/run/dpdk/spdk_pid623943
00:09:28.565 Removing: /var/run/dpdk/spdk_pid624217
00:09:28.565 Removing: /var/run/dpdk/spdk_pid624498
00:09:28.565 Removing: /var/run/dpdk/spdk_pid624766
00:09:28.565 Removing: /var/run/dpdk/spdk_pid625105
00:09:28.565 Removing: /var/run/dpdk/spdk_pid625307
00:09:28.565 Removing: /var/run/dpdk/spdk_pid625492
00:09:28.565 Removing: /var/run/dpdk/spdk_pid626056
00:09:28.565 Removing: /var/run/dpdk/spdk_pid626472
00:09:28.565 Removing: /var/run/dpdk/spdk_pid626738
00:09:28.565 Removing: /var/run/dpdk/spdk_pid627021
00:09:28.565 Removing: /var/run/dpdk/spdk_pid627221
00:09:28.565 Removing: /var/run/dpdk/spdk_pid627416
00:09:28.565 Removing: /var/run/dpdk/spdk_pid627600
00:09:28.565 Removing: /var/run/dpdk/spdk_pid627881
00:09:28.565 Removing: /var/run/dpdk/spdk_pid628149
00:09:28.565 Removing: /var/run/dpdk/spdk_pid628436
00:09:28.565 Removing: /var/run/dpdk/spdk_pid628701
00:09:28.565 Removing: /var/run/dpdk/spdk_pid628869
00:09:28.565 Removing: /var/run/dpdk/spdk_pid629018
00:09:28.565 Removing: /var/run/dpdk/spdk_pid629299
00:09:28.565 Removing: /var/run/dpdk/spdk_pid629567
00:09:28.565 Removing: /var/run/dpdk/spdk_pid629848
00:09:28.565 Removing: /var/run/dpdk/spdk_pid630114
00:09:28.565 Removing: /var/run/dpdk/spdk_pid630297
00:09:28.565 Removing: /var/run/dpdk/spdk_pid630450
00:09:28.565 Removing: /var/run/dpdk/spdk_pid630709
00:09:28.565 Removing: /var/run/dpdk/spdk_pid630985
00:09:28.565 Removing: /var/run/dpdk/spdk_pid631272
00:09:28.565 Removing: /var/run/dpdk/spdk_pid631544
00:09:28.565 Removing: /var/run/dpdk/spdk_pid631828
00:09:28.565 Removing: /var/run/dpdk/spdk_pid631988
00:09:28.565 Removing: /var/run/dpdk/spdk_pid632162
00:09:28.565 Removing: /var/run/dpdk/spdk_pid632409
00:09:28.565 Removing: /var/run/dpdk/spdk_pid632697
00:09:28.565 Removing: /var/run/dpdk/spdk_pid632892
00:09:28.565 Removing: /var/run/dpdk/spdk_pid633106
00:09:28.565 Removing: /var/run/dpdk/spdk_pid633864
00:09:28.565 Removing: /var/run/dpdk/spdk_pid634285
00:09:28.565 Removing: /var/run/dpdk/spdk_pid634696
00:09:28.565 Removing: /var/run/dpdk/spdk_pid635233
00:09:28.565 Removing: /var/run/dpdk/spdk_pid635636
00:09:28.565 Removing: /var/run/dpdk/spdk_pid636070
00:09:28.565 Removing: /var/run/dpdk/spdk_pid636626
00:09:28.565 Removing: /var/run/dpdk/spdk_pid637077
00:09:28.565 Removing: /var/run/dpdk/spdk_pid637462
00:09:28.565 Removing: /var/run/dpdk/spdk_pid637999
00:09:28.565 Removing: /var/run/dpdk/spdk_pid638460
00:09:28.565 Removing: /var/run/dpdk/spdk_pid638830
00:09:28.565 Removing: /var/run/dpdk/spdk_pid639378
00:09:28.565 Removing: /var/run/dpdk/spdk_pid639882
00:09:28.565 Removing: /var/run/dpdk/spdk_pid640210
00:09:28.565 Removing: /var/run/dpdk/spdk_pid640746
00:09:28.565 Removing: /var/run/dpdk/spdk_pid641286
00:09:28.565 Removing: /var/run/dpdk/spdk_pid641584
00:09:28.565 Removing: /var/run/dpdk/spdk_pid642127
00:09:28.565 Removing: /var/run/dpdk/spdk_pid642665
00:09:28.565 Removing: /var/run/dpdk/spdk_pid642960
00:09:28.565 Removing: /var/run/dpdk/spdk_pid643500
00:09:28.565 Removing: /var/run/dpdk/spdk_pid643897
00:09:28.565 Removing: /var/run/dpdk/spdk_pid644333
00:09:28.565 Removing: /var/run/dpdk/spdk_pid644862
00:09:28.565 Removing: /var/run/dpdk/spdk_pid645410
00:09:28.565 Removing: /var/run/dpdk/spdk_pid645789
00:09:28.565 Removing: /var/run/dpdk/spdk_pid646333
00:09:28.565 Removing: /var/run/dpdk/spdk_pid646876
00:09:28.565 Removing: /var/run/dpdk/spdk_pid647203
00:09:28.565 Removing: /var/run/dpdk/spdk_pid647730
00:09:28.565 Removing: /var/run/dpdk/spdk_pid648273
00:09:28.565 Clean
00:09:31.857 killing process with pid 559613
00:09:32.117 killing process with pid 559610
00:09:32.117 killing process with pid 559612
00:09:32.117 killing process with pid 559611
00:09:32.117 17:53:24 -- common/autotest_common.sh@1446 -- # return 0
00:09:32.117 17:53:24 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup
00:09:32.117 17:53:24 -- common/autotest_common.sh@728 -- # xtrace_disable
00:09:32.117 17:53:24 -- common/autotest_common.sh@10 -- # set +x
00:09:32.117 17:53:24 -- spdk/autotest.sh@376 -- # timing_exit autotest
00:09:32.376 17:53:24 -- common/autotest_common.sh@728 -- # xtrace_disable
00:09:32.376 17:53:24 -- common/autotest_common.sh@10 -- # set +x
00:09:32.376 17:53:24 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:32.376 17:53:24 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:32.376 17:53:24 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:32.376 17:53:24 -- spdk/autotest.sh@381 -- # [[ y == y ]]
00:09:32.376 17:53:24 -- spdk/autotest.sh@383 -- # hostname
00:09:32.376 17:53:25 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:33.314 geninfo: WARNING: invalid characters removed from testname!
00:09:33.314 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda
00:09:33.314 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda
00:09:33.314 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda
00:09:45.526 17:53:36 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:50.952 17:53:42 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:55.147 17:53:47 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:00.421 17:53:52 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:04.614 17:53:57 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:09.899 17:54:01 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:14.093 17:54:06 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:10:14.093 17:54:06 -- common/autotest_common.sh@1689 -- $ [[ y == y ]]
00:10:14.093 17:54:06 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}'
00:10:14.093 17:54:06 -- common/autotest_common.sh@1690 -- $ lcov --version
00:10:14.093 17:54:06 -- common/autotest_common.sh@1690 -- $ lt 1.15 2
00:10:14.093 17:54:06 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2
00:10:14.093 17:54:06 -- scripts/common.sh@332 -- $ local ver1 ver1_l
00:10:14.093 17:54:06 -- scripts/common.sh@333 -- $ local ver2 ver2_l
00:10:14.093 17:54:06 -- scripts/common.sh@335 -- $ IFS=.-:
00:10:14.093 17:54:06 -- scripts/common.sh@335 -- $ read -ra ver1
00:10:14.093 17:54:06 -- scripts/common.sh@336 -- $ IFS=.-:
00:10:14.093 17:54:06 -- scripts/common.sh@336 -- $ read -ra ver2
00:10:14.093 17:54:06 -- scripts/common.sh@337 -- $ local 'op=<'
00:10:14.093 17:54:06 -- scripts/common.sh@339 -- $ ver1_l=2
00:10:14.093 17:54:06 -- scripts/common.sh@340 -- $ ver2_l=1
00:10:14.093 17:54:06 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
00:10:14.093 17:54:06 -- scripts/common.sh@343 -- $ case "$op" in
00:10:14.093 17:54:06 -- scripts/common.sh@344 -- $ : 1
00:10:14.093 17:54:06 -- scripts/common.sh@363 -- $ (( v = 0 ))
00:10:14.093 17:54:06 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:10:14.093 17:54:06 -- scripts/common.sh@364 -- $ decimal 1
00:10:14.093 17:54:06 -- scripts/common.sh@352 -- $ local d=1
00:10:14.093 17:54:06 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]]
00:10:14.093 17:54:06 -- scripts/common.sh@354 -- $ echo 1
00:10:14.093 17:54:06 -- scripts/common.sh@364 -- $ ver1[v]=1
00:10:14.093 17:54:06 -- scripts/common.sh@365 -- $ decimal 2
00:10:14.093 17:54:06 -- scripts/common.sh@352 -- $ local d=2
00:10:14.093 17:54:06 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]]
00:10:14.093 17:54:06 -- scripts/common.sh@354 -- $ echo 2
00:10:14.093 17:54:06 -- scripts/common.sh@365 -- $ ver2[v]=2
00:10:14.093 17:54:06 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:10:14.093 17:54:06 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] ))
00:10:14.093 17:54:06 -- scripts/common.sh@367 -- $ return 0
00:10:14.093 17:54:06 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:10:14.093 17:54:06 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS=
00:10:14.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:10:14.093 --rc genhtml_branch_coverage=1
00:10:14.093 --rc genhtml_function_coverage=1
00:10:14.093 --rc genhtml_legend=1
00:10:14.093 --rc geninfo_all_blocks=1
00:10:14.093 --rc geninfo_unexecuted_blocks=1
00:10:14.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:10:14.093 '
00:10:14.093 17:54:06 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS='
00:10:14.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:10:14.093 --rc genhtml_branch_coverage=1
00:10:14.093 --rc genhtml_function_coverage=1
00:10:14.093 --rc genhtml_legend=1
00:10:14.093 --rc geninfo_all_blocks=1
00:10:14.093 --rc geninfo_unexecuted_blocks=1
00:10:14.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:10:14.093 '
00:10:14.093 17:54:06 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov
00:10:14.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:10:14.093 --rc genhtml_branch_coverage=1
00:10:14.093 --rc genhtml_function_coverage=1
00:10:14.093 --rc genhtml_legend=1
00:10:14.093 --rc geninfo_all_blocks=1
00:10:14.093 --rc geninfo_unexecuted_blocks=1
00:10:14.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:10:14.093 '
00:10:14.093 17:54:06 -- common/autotest_common.sh@1704 -- $ LCOV='lcov
00:10:14.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:10:14.093 --rc genhtml_branch_coverage=1
00:10:14.093 --rc genhtml_function_coverage=1
00:10:14.093 --rc genhtml_legend=1
00:10:14.093 --rc geninfo_all_blocks=1
00:10:14.093 --rc geninfo_unexecuted_blocks=1
00:10:14.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:10:14.093 '
00:10:14.094 17:54:06 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:10:14.094 17:54:06 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:10:14.094 17:54:06 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:10:14.094 17:54:06 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:10:14.094 17:54:06 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:14.094 17:54:06 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:14.094 17:54:06 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:14.094 17:54:06 -- paths/export.sh@5 -- $ export PATH
00:10:14.094 17:54:06 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:14.094 17:54:06 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:10:14.094 17:54:06 -- common/autobuild_common.sh@440 -- $ date +%s
00:10:14.094 17:54:06 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732035246.XXXXXX
00:10:14.094 17:54:06 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732035246.ZWGAEd
00:10:14.094 17:54:06 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:10:14.094 17:54:06 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']'
00:10:14.094 17:54:06 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:10:14.094 17:54:06 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:10:14.094 17:54:06 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:10:14.094 17:54:06 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:10:14.094 17:54:06 -- common/autobuild_common.sh@456 -- $ get_config_params
00:10:14.094 17:54:06 -- common/autotest_common.sh@397 -- $ xtrace_disable
00:10:14.094 17:54:06 -- common/autotest_common.sh@10 -- $ set +x
00:10:14.094 17:54:06 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:10:14.094 17:54:06 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:10:14.094 17:54:06 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:14.094 17:54:06 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:10:14.094 17:54:06 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:10:14.094 17:54:06 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:10:14.094 17:54:06 -- spdk/autopackage.sh@19 -- $ timing_finish
00:10:14.094 17:54:06 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:14.094 17:54:06 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:10:14.094 17:54:06 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:14.094 17:54:06 -- spdk/autopackage.sh@20 -- $ exit 0
00:10:14.105 + [[ -n 504304 ]]
00:10:14.105 + sudo kill 504304
00:10:14.117 [Pipeline] }
00:10:14.121 [Pipeline] // stage
00:10:14.126 [Pipeline] }
00:10:14.141 [Pipeline] // timeout
00:10:14.146 [Pipeline] }
00:10:14.161 [Pipeline] // catchError
00:10:14.166 [Pipeline] }
00:10:14.181 [Pipeline] // wrap
00:10:14.187 [Pipeline] }
00:10:14.200 [Pipeline] // catchError
00:10:14.210 [Pipeline] stage
00:10:14.212 [Pipeline] { (Epilogue)
00:10:14.226 [Pipeline] catchError
00:10:14.228 [Pipeline] {
00:10:14.241 [Pipeline] echo
00:10:14.243 Cleanup processes
00:10:14.249 [Pipeline] sh
00:10:14.537 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:14.537 658092 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:14.552 [Pipeline] sh
00:10:14.840 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:14.840 ++ grep -v 'sudo pgrep'
00:10:14.840 ++ awk '{print $1}'
00:10:14.840 + sudo kill -9
00:10:14.840 + true
00:10:14.852 [Pipeline] sh
00:10:15.138 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:15.138 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:15.138 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:16.516 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:26.509 [Pipeline] sh
00:10:26.810 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:26.810 Artifacts sizes are good
00:10:26.824 [Pipeline] archiveArtifacts
00:10:26.831 Archiving artifacts
00:10:26.956 [Pipeline] sh
00:10:27.242 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:27.255 [Pipeline] cleanWs
00:10:27.264 [WS-CLEANUP] Deleting project workspace...
00:10:27.264 [WS-CLEANUP] Deferred wipeout is used...
00:10:27.271 [WS-CLEANUP] done
00:10:27.272 [Pipeline] }
00:10:27.287 [Pipeline] // catchError
00:10:27.297 [Pipeline] sh
00:10:27.580 + logger -p user.info -t JENKINS-CI
00:10:27.588 [Pipeline] }
00:10:27.603 [Pipeline] // stage
00:10:27.609 [Pipeline] }
00:10:27.625 [Pipeline] // node
00:10:27.630 [Pipeline] End of Pipeline
00:10:27.672 Finished: SUCCESS