00:00:00.001 Started by upstream project "autotest-nightly-lts" build number 2008
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3269
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.001 Started by timer
00:00:00.024 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.025 The recommended git tool is: git
00:00:00.025 using credential 00000000-0000-0000-0000-000000000002
00:00:00.027 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.039 Fetching changes from the remote Git repository
00:00:00.042 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.060 Using shallow fetch with depth 1
00:00:00.060 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.060 > git --version # timeout=10
00:00:00.084 > git --version # 'git version 2.39.2'
00:00:00.084 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.107 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.107 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.564 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.573 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.584 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.584 > git config core.sparsecheckout # timeout=10
00:00:02.595 > git read-tree -mu HEAD # timeout=10
00:00:02.607 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.623 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.623 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.692 [Pipeline] Start of Pipeline
00:00:02.704 [Pipeline] library
00:00:02.705 Loading library shm_lib@master
00:00:02.705 Library shm_lib@master is cached. Copying from home.
00:00:02.746 [Pipeline] node
00:00:02.773 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:02.776 [Pipeline] {
00:00:02.790 [Pipeline] catchError
00:00:02.792 [Pipeline] {
00:00:02.811 [Pipeline] wrap
00:00:02.825 [Pipeline] {
00:00:02.836 [Pipeline] stage
00:00:02.838 [Pipeline] { (Prologue)
00:00:03.030 [Pipeline] sh
00:00:03.315 + logger -p user.info -t JENKINS-CI
00:00:03.333 [Pipeline] echo
00:00:03.335 Node: WFP20
00:00:03.342 [Pipeline] sh
00:00:03.636 [Pipeline] setCustomBuildProperty
00:00:03.644 [Pipeline] echo
00:00:03.646 Cleanup processes
00:00:03.652 [Pipeline] sh
00:00:03.932 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.932 211228 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.947 [Pipeline] sh
00:00:04.232 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.232 ++ grep -v 'sudo pgrep'
00:00:04.232 ++ awk '{print $1}'
00:00:04.232 + sudo kill -9
00:00:04.232 + true
00:00:04.247 [Pipeline] cleanWs
00:00:04.256 [WS-CLEANUP] Deleting project workspace...
00:00:04.256 [WS-CLEANUP] Deferred wipeout is used...
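The "Cleanup processes" step above is a small shell idiom worth noting: list any stale processes still referencing the workspace, strip the pgrep invocation itself from the listing, and force-kill whatever is left without ever failing the build step. A minimal sketch of that pattern (paths taken from this log; in this run the pid list came back empty, so kill -9 had nothing to act on and the trailing '+ true' kept the step green):

  # Kill leftover SPDK processes from a previous run; tolerate the no-match case.
  pids=$(sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk \
         | grep -v 'sudo pgrep' | awk '{print $1}')
  sudo kill -9 $pids || true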
00:00:04.262 [WS-CLEANUP] done
00:00:04.267 [Pipeline] setCustomBuildProperty
00:00:04.282 [Pipeline] sh
00:00:04.569 + sudo git config --global --replace-all safe.directory '*'
00:00:04.653 [Pipeline] httpRequest
00:00:04.671 [Pipeline] echo
00:00:04.672 Sorcerer 10.211.164.101 is alive
00:00:04.679 [Pipeline] httpRequest
00:00:04.683 HttpMethod: GET
00:00:04.683 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.684 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.686 Response Code: HTTP/1.1 200 OK
00:00:04.686 Success: Status code 200 is in the accepted range: 200,404
00:00:04.686 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.428 [Pipeline] sh
00:00:05.710 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.727 [Pipeline] httpRequest
00:00:05.754 [Pipeline] echo
00:00:05.756 Sorcerer 10.211.164.101 is alive
00:00:05.764 [Pipeline] httpRequest
00:00:05.786 HttpMethod: GET
00:00:05.786 URL: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:00:05.786 Sending request to url: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:00:05.787 Response Code: HTTP/1.1 200 OK
00:00:05.787 Success: Status code 200 is in the accepted range: 200,404
00:00:05.787 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:01:06.950 [Pipeline] sh
00:01:07.235 + tar --no-same-owner -xf spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:01:09.788 [Pipeline] sh
00:01:10.073 + git -C spdk log --oneline -n5
00:01:10.073 4b94202c6 lib/event: Bug fix for framework_set_scheduler
00:01:10.073 507e9ba07 nvme: add lock_depth for ctrlr_lock
00:01:10.074 62fda7b5f nvme: check pthread_mutex_destroy() return value
00:01:10.074 e03c164a1 nvme: add nvme_ctrlr_lock
00:01:10.074 d61f89a86 nvme/cuse: Add ctrlr_lock for cuse register and unregister
00:01:10.087 [Pipeline] }
00:01:10.104 [Pipeline] // stage
00:01:10.114 [Pipeline] stage
00:01:10.116 [Pipeline] { (Prepare)
00:01:10.140 [Pipeline] writeFile
00:01:10.163 [Pipeline] sh
00:01:10.449 + logger -p user.info -t JENKINS-CI
00:01:10.462 [Pipeline] sh
00:01:10.744 + logger -p user.info -t JENKINS-CI
00:01:10.757 [Pipeline] sh
00:01:11.040 + cat autorun-spdk.conf
00:01:11.040 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:11.040 SPDK_TEST_FUZZER_SHORT=1
00:01:11.040 SPDK_TEST_FUZZER=1
00:01:11.040 SPDK_RUN_UBSAN=1
00:01:11.047 RUN_NIGHTLY=1
00:01:11.052 [Pipeline] readFile
00:01:11.077 [Pipeline] withEnv
00:01:11.079 [Pipeline] {
00:01:11.092 [Pipeline] sh
00:01:11.376 + set -ex
00:01:11.377 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:11.377 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:11.377 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:11.377 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:11.377 ++ SPDK_TEST_FUZZER=1
00:01:11.377 ++ SPDK_RUN_UBSAN=1
00:01:11.377 ++ RUN_NIGHTLY=1
00:01:11.377 + case $SPDK_TEST_NVMF_NICS in
00:01:11.377 + DRIVERS=
00:01:11.377 + [[ -n '' ]]
00:01:11.377 + exit 0
00:01:11.393 [Pipeline] }
00:01:11.413 [Pipeline] // withEnv
00:01:11.417 [Pipeline] }
00:01:11.433 [Pipeline] // stage
00:01:11.443 [Pipeline] catchError
00:01:11.446 [Pipeline] {
00:01:11.463 [Pipeline] timeout
00:01:11.464 Timeout set to expire in 30 min
00:01:11.466 [Pipeline] {
00:01:11.485 [Pipeline] stage
00:01:11.488 [Pipeline] { (Tests)
00:01:11.506 [Pipeline] sh
00:01:11.795 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:11.795 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:11.795 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:11.795 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:11.795 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:11.795 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:11.795 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:11.795 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:11.795 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:11.795 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:11.795 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:11.795 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:11.795 + source /etc/os-release
00:01:11.795 ++ NAME='Fedora Linux'
00:01:11.795 ++ VERSION='38 (Cloud Edition)'
00:01:11.795 ++ ID=fedora
00:01:11.795 ++ VERSION_ID=38
00:01:11.795 ++ VERSION_CODENAME=
00:01:11.795 ++ PLATFORM_ID=platform:f38
00:01:11.795 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:11.795 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:11.795 ++ LOGO=fedora-logo-icon
00:01:11.795 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:11.795 ++ HOME_URL=https://fedoraproject.org/
00:01:11.795 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:11.795 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:11.795 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:11.795 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:11.795 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:11.795 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:11.795 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:11.795 ++ SUPPORT_END=2024-05-14
00:01:11.795 ++ VARIANT='Cloud Edition'
00:01:11.795 ++ VARIANT_ID=cloud
00:01:11.795 + uname -a
00:01:11.795 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:11.795 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:15.084 Hugepages
00:01:15.084 node hugesize free / total
00:01:15.084 node0 1048576kB 0 / 0
00:01:15.084 node0 2048kB 0 / 0
00:01:15.084 node1 1048576kB 0 / 0
00:01:15.084 node1 2048kB 0 / 0
00:01:15.084
00:01:15.084 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:15.084 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:15.084 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:15.084 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:15.084 + rm -f /tmp/spdk-ld-path
00:01:15.084 + source autorun-spdk.conf
00:01:15.084 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:15.084 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:15.084 ++ SPDK_TEST_FUZZER=1
00:01:15.084 ++ SPDK_RUN_UBSAN=1
00:01:15.084 ++ RUN_NIGHTLY=1
00:01:15.084 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:15.084 + [[ -n '' ]]
00:01:15.084 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:15.084 + for M in /var/spdk/build-*-manifest.txt
00:01:15.084 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:15.084 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:15.084 + for M in /var/spdk/build-*-manifest.txt
00:01:15.084 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:15.084 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:15.084 ++ uname
00:01:15.084 + [[ Linux == \L\i\n\u\x ]]
00:01:15.084 + sudo dmesg -T
00:01:15.084 + sudo dmesg --clear
00:01:15.084 + dmesg_pid=212686
00:01:15.084 + [[ Fedora Linux == FreeBSD ]]
00:01:15.084 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:15.084 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:15.084 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:15.084 + [[ -x /usr/src/fio-static/fio ]]
00:01:15.084 + export FIO_BIN=/usr/src/fio-static/fio
00:01:15.084 + FIO_BIN=/usr/src/fio-static/fio
00:01:15.084 + sudo dmesg -Tw
00:01:15.084 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:15.084 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:15.084 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:15.084 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:15.084 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:15.084 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:15.084 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:15.084 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:15.084 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:15.084 Test configuration:
00:01:15.084 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:15.084 SPDK_TEST_FUZZER_SHORT=1
00:01:15.084 SPDK_TEST_FUZZER=1
00:01:15.084 SPDK_RUN_UBSAN=1
00:01:15.084 RUN_NIGHTLY=1
00:09:13 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:13 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:13 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:13 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:13 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13 -- paths/export.sh@5 -- $ export PATH
00:09:13 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:13 -- common/autobuild_common.sh@435 -- $ date +%s
00:09:13 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720994953.XXXXXX
00:09:13 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720994953.weBE2a
00:09:13 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:09:13 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:09:13 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:09:13 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:13 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:13 -- common/autobuild_common.sh@451 -- $ get_config_params
00:09:13 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:09:13 -- common/autotest_common.sh@10 -- $ set +x
00:09:13 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:09:13 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:09:13 -- spdk/autobuild.sh@12 -- $ umask 022
00:09:13 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:13 -- spdk/autobuild.sh@16 -- $ date -u
00:01:15.084 Sun Jul 14 10:09:13 PM UTC 2024
00:09:13 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:15.084 LTS-59-g4b94202c6
00:09:13 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:09:13 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
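Worth noting in the trace above: autorun-spdk.conf is nothing more than a shell fragment of FLAG=1 assignments, and spdk/autorun.sh sources it (echoing it first under "Test configuration:") to decide which suites and sanitizers the job runs. A minimal sketch of driving the same entry point by hand, assuming an SPDK checkout in ./spdk:

  # autorun-spdk.conf: plain shell assignments, one test switch per line.
  cat > autorun-spdk.conf <<'EOF'
  SPDK_RUN_FUNCTIONAL_TEST=1
  SPDK_TEST_FUZZER_SHORT=1
  SPDK_TEST_FUZZER=1
  SPDK_RUN_UBSAN=1
  EOF
  spdk/autorun.sh "$PWD/autorun-spdk.conf"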
00:01:15.084 00:09:13 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:09:13 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:09:13 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:09:13 -- common/autotest_common.sh@10 -- $ set +x
00:01:15.084 ************************************
00:01:15.084 START TEST ubsan
00:01:15.084 ************************************
00:09:13 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:15.084 using ubsan
00:01:15.084
00:01:15.084 real 0m0.000s
00:01:15.084 user 0m0.000s
00:01:15.084 sys 0m0.000s
00:09:13 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:09:13 -- common/autotest_common.sh@10 -- $ set +x
00:01:15.085 ************************************
00:01:15.085 END TEST ubsan
00:01:15.085 ************************************
00:09:13 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:09:13 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:09:13 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:09:13 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:09:13 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:09:13 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:09:13 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:09:13 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:09:13 -- common/autotest_common.sh@10 -- $ set +x
00:01:15.085 ************************************
00:01:15.085 START TEST autobuild_llvm_precompile
00:01:15.085 ************************************
00:09:13 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:09:13 -- common/autobuild_common.sh@32 -- $ clang --version
00:09:13 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:01:15.085 Target: x86_64-redhat-linux-gnu
00:01:15.085 Thread model: posix
00:01:15.085 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:09:13 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:09:13 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:09:13 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:09:13 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:09:13 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:09:13 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:09:13 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:09:13 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:09:13 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:09:13 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:15.344 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:15.344 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:15.911 Using 'verbs' RDMA provider
00:01:31.374 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:01:46.294 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:01:46.294 Creating mk/config.mk...done.
00:01:46.294 Creating mk/cc.flags.mk...done.
00:01:46.294 Type 'make' to build.
00:01:46.294
00:01:46.294 real 0m29.360s
00:01:46.294 user 0m12.581s
00:01:46.294 sys 0m16.178s
00:09:43 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:09:43 -- common/autotest_common.sh@10 -- $ set +x
00:01:46.294 ************************************
00:01:46.294 END TEST autobuild_llvm_precompile
00:01:46.294 ************************************
00:09:43 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:09:43 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:09:43 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:09:43 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:09:43 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:46.294 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:46.294 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:46.294 Using 'verbs' RDMA provider
00:01:58.514 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:10.777 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:10.777 Creating mk/config.mk...done.
00:02:10.777 Creating mk/cc.flags.mk...done.
00:02:10.777 Type 'make' to build.
00:10:08 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:10:08 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:10:08 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:10:08 -- common/autotest_common.sh@10 -- $ set +x
00:02:10.777 ************************************
00:02:10.777 START TEST make
00:02:10.777 ************************************
00:10:08 -- common/autotest_common.sh@1104 -- $ make -j112
00:02:10.777 make[1]: Nothing to be done for 'all'.
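The autobuild_llvm_precompile step above shows how the fuzzer toolchain gets picked: the script regex-matches clang's version banner, exports CC/CXX to the matching clang binaries, and globs for the libclang_rt.fuzzer_no_main runtime (the flavor without libFuzzer's own main(), for harnesses that supply their own entry point), which then lands on the configure line as --with-fuzzer. A sketch of that detection logic, assuming bash with extglob enabled and the Fedora clang layout seen in this log:

  # Extract "16.0.6" and "16" from the clang version banner.
  shopt -s extglob
  [[ $(clang --version) =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]
  clang_version=${BASH_REMATCH[1]}   # e.g. 16.0.6
  clang_num=${BASH_REMATCH[2]}       # e.g. 16
  export CC=clang-$clang_num CXX=clang++-$clang_num
  # Glob /usr/lib and /usr/lib64, matching either the major or the full version dir.
  fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
  [[ -e ${fuzzer_libs[0]} ]] && echo "--with-fuzzer=${fuzzer_libs[0]}"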
00:02:11.344 The Meson build system
00:02:11.344 Version: 1.3.1
00:02:11.344 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:11.344 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:11.344 Build type: native build
00:02:11.344 Project name: libvfio-user
00:02:11.344 Project version: 0.0.1
00:02:11.344 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:11.344 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:11.344 Host machine cpu family: x86_64
00:02:11.344 Host machine cpu: x86_64
00:02:11.344 Run-time dependency threads found: YES
00:02:11.344 Library dl found: YES
00:02:11.344 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:11.344 Run-time dependency json-c found: YES 0.17
00:02:11.344 Run-time dependency cmocka found: YES 1.1.7
00:02:11.344 Program pytest-3 found: NO
00:02:11.344 Program flake8 found: NO
00:02:11.344 Program misspell-fixer found: NO
00:02:11.344 Program restructuredtext-lint found: NO
00:02:11.344 Program valgrind found: YES (/usr/bin/valgrind)
00:02:11.344 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:11.344 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:11.344 Compiler for C supports arguments -Wwrite-strings: YES
00:02:11.344 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:11.344 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:11.344 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:11.344 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
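For reference, the Meson invocation that produces a configure log like the one above is short; a sketch with the same user-defined options (debug build, static library, /usr/local/lib libdir), using hypothetical source and build directory names:

  meson setup build-debug libvfio-user \
      -Dbuildtype=debug -Ddefault_library=static -Dlibdir=/usr/local/lib
  ninja -C build-debug
  DESTDIR=/tmp/stage meson install --quiet -C build-debug   # staged install, as below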
00:02:11.344 Build targets in project: 8
00:02:11.344 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:11.344 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:11.344
00:02:11.344 libvfio-user 0.0.1
00:02:11.344
00:02:11.344 User defined options
00:02:11.344 buildtype : debug
00:02:11.344 default_library: static
00:02:11.344 libdir : /usr/local/lib
00:02:11.344
00:02:11.344 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:11.603 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:11.603 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:11.603 [2/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:11.603 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:11.603 [4/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:11.603 [5/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:11.603 [6/36] Compiling C object samples/null.p/null.c.o
00:02:11.603 [7/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:11.603 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:11.603 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:11.603 [10/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:11.603 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:11.603 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:11.603 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:11.603 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:11.603 [15/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:11.603 [16/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:11.603 [17/36] Compiling C object samples/server.p/server.c.o
00:02:11.603 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:11.603 [19/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:11.603 [20/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:11.603 [21/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:11.603 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:11.603 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:11.603 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:11.603 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:11.603 [26/36] Compiling C object samples/client.p/client.c.o
00:02:11.603 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:11.603 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:11.862 [29/36] Linking static target lib/libvfio-user.a
00:02:11.862 [30/36] Linking target samples/client
00:02:11.862 [31/36] Linking target test/unit_tests
00:02:11.862 [32/36] Linking target samples/gpio-pci-idio-16
00:02:11.862 [33/36] Linking target samples/shadow_ioeventfd_server
00:02:11.862 [34/36] Linking target samples/server
00:02:11.862 [35/36] Linking target samples/null
00:02:11.862 [36/36] Linking target samples/lspci
00:02:11.862 INFO: autodetecting backend as ninja
00:02:11.862 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:11.862 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:12.123 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:12.123 ninja: no work to do.
00:02:17.398 The Meson build system
00:02:17.398 Version: 1.3.1
00:02:17.398 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:02:17.398 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:02:17.398 Build type: native build
00:02:17.398 Program cat found: YES (/usr/bin/cat)
00:02:17.398 Project name: DPDK
00:02:17.398 Project version: 23.11.0
00:02:17.398 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:17.398 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:17.398 Host machine cpu family: x86_64
00:02:17.398 Host machine cpu: x86_64
00:02:17.398 Message: ## Building in Developer Mode ##
00:02:17.398 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:17.398 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:17.398 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:17.398 Program python3 found: YES (/usr/bin/python3)
00:02:17.398 Program cat found: YES (/usr/bin/cat)
00:02:17.398 Compiler for C supports arguments -march=native: YES
00:02:17.398 Checking for size of "void *" : 8
00:02:17.398 Checking for size of "void *" : 8 (cached)
00:02:17.398 Library m found: YES
00:02:17.398 Library numa found: YES
00:02:17.398 Has header "numaif.h" : YES
00:02:17.398 Library fdt found: NO
00:02:17.398 Library execinfo found: NO
00:02:17.398 Has header "execinfo.h" : YES
00:02:17.398 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:17.398 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:17.398 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:17.398 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:17.398 Run-time dependency openssl found: YES 3.0.9
00:02:17.398 Run-time dependency libpcap found: YES 1.10.4
00:02:17.398 Has header "pcap.h" with dependency libpcap: YES
00:02:17.398 Compiler for C supports arguments -Wcast-qual: YES
00:02:17.398 Compiler for C supports arguments -Wdeprecated: YES
00:02:17.398 Compiler for C supports arguments -Wformat: YES
00:02:17.398 Compiler for C supports arguments -Wformat-nonliteral: YES
00:02:17.398 Compiler for C supports arguments -Wformat-security: YES
00:02:17.398 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:17.398 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:17.398 Compiler for C supports arguments -Wnested-externs: YES
00:02:17.398 Compiler for C supports arguments -Wold-style-definition: YES
00:02:17.398 Compiler for C supports arguments -Wpointer-arith: YES
00:02:17.398 Compiler for C supports arguments -Wsign-compare: YES
00:02:17.398 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:17.399 Compiler for C supports arguments -Wundef: YES
00:02:17.399 Compiler for C supports arguments -Wwrite-strings: YES
00:02:17.399 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:17.399 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:02:17.399 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:17.399 Program objdump found: YES (/usr/bin/objdump)
00:02:17.399 Compiler for C supports arguments -mavx512f: YES
00:02:17.399 Checking if "AVX512 checking" compiles: YES
00:02:17.399 Fetching value of define "__SSE4_2__" : 1
00:02:17.399 Fetching value of define "__AES__" : 1
00:02:17.399 Fetching value of define "__AVX__" : 1
00:02:17.399 Fetching value of define "__AVX2__" : 1
00:02:17.399 Fetching value of define "__AVX512BW__" : 1
00:02:17.399 Fetching value of define "__AVX512CD__" : 1
00:02:17.399 Fetching value of define "__AVX512DQ__" : 1
00:02:17.399 Fetching value of define "__AVX512F__" : 1
00:02:17.399 Fetching value of define "__AVX512VL__" : 1
00:02:17.399 Fetching value of define "__PCLMUL__" : 1
00:02:17.399 Fetching value of define "__RDRND__" : 1
00:02:17.399 Fetching value of define "__RDSEED__" : 1
00:02:17.399 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:17.399 Fetching value of define "__znver1__" : (undefined)
00:02:17.399 Fetching value of define "__znver2__" : (undefined)
00:02:17.399 Fetching value of define "__znver3__" : (undefined)
00:02:17.399 Fetching value of define "__znver4__" : (undefined)
00:02:17.399 Compiler for C supports arguments -Wno-format-truncation: NO
00:02:17.399 Message: lib/log: Defining dependency "log"
00:02:17.399 Message: lib/kvargs: Defining dependency "kvargs"
00:02:17.399 Message: lib/telemetry: Defining dependency "telemetry"
00:02:17.399 Checking for function "getentropy" : NO
00:02:17.399 Message: lib/eal: Defining dependency "eal"
00:02:17.399 Message: lib/ring: Defining dependency "ring"
00:02:17.399 Message: lib/rcu: Defining dependency "rcu"
00:02:17.399 Message: lib/mempool: Defining dependency "mempool"
00:02:17.399 Message: lib/mbuf: Defining dependency "mbuf"
00:02:17.399 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:17.399 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:17.399 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:17.399 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:17.399 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:17.399 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:17.399 Compiler for C supports arguments -mpclmul: YES
00:02:17.399 Compiler for C supports arguments -maes: YES
00:02:17.399 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:17.399 Compiler for C supports arguments -mavx512bw: YES
00:02:17.399 Compiler for C supports arguments -mavx512dq: YES
00:02:17.399 Compiler for C supports arguments -mavx512vl: YES
00:02:17.399 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:17.399 Compiler for C supports arguments -mavx2: YES
00:02:17.399 Compiler for C supports arguments -mavx: YES
00:02:17.399 Message: lib/net: Defining dependency "net"
00:02:17.399 Message: lib/meter: Defining dependency "meter"
00:02:17.399 Message: lib/ethdev: Defining dependency "ethdev"
00:02:17.399 Message: lib/pci: Defining dependency "pci"
00:02:17.399 Message: lib/cmdline: Defining dependency "cmdline"
00:02:17.399 Message: lib/hash: Defining dependency "hash"
00:02:17.399 Message: lib/timer: Defining dependency "timer"
00:02:17.399 Message: lib/compressdev: Defining dependency "compressdev"
00:02:17.399 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:17.399 Message: lib/dmadev: Defining dependency "dmadev"
00:02:17.399 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:17.399 Message: lib/power: Defining dependency "power"
00:02:17.399 Message: lib/reorder: Defining dependency "reorder"
00:02:17.399 Message: lib/security: Defining dependency "security"
00:02:17.399 Has header "linux/userfaultfd.h" : YES
00:02:17.399 Has header "linux/vduse.h" : YES
00:02:17.399 Message: lib/vhost: Defining dependency "vhost"
00:02:17.399 Compiler for C supports arguments -Wno-format-truncation: NO (cached)
00:02:17.399 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:17.399 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:17.399 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:17.399 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:02:17.399 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:17.399 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:17.399 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:17.399 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:17.399 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:17.399 Program doxygen found: YES (/usr/bin/doxygen)
00:02:17.399 Configuring doxy-api-html.conf using configuration
00:02:17.399 Configuring doxy-api-man.conf using configuration
00:02:17.399 Program mandb found: YES (/usr/bin/mandb)
00:02:17.399 Program sphinx-build found: NO
00:02:17.399 Configuring rte_build_config.h using configuration
00:02:17.399 Message:
00:02:17.399 =================
00:02:17.399 Applications Enabled
00:02:17.399 =================
00:02:17.399
00:02:17.399 apps:
00:02:17.399
00:02:17.399
00:02:17.399 Message:
00:02:17.399 =================
00:02:17.399 Libraries Enabled
00:02:17.399 =================
00:02:17.399
00:02:17.399 libs:
00:02:17.399 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:17.399 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:17.399 cryptodev, dmadev, power, reorder, security, vhost,
00:02:17.399
00:02:17.399 Message:
00:02:17.399 ===============
00:02:17.399 Drivers Enabled
00:02:17.399 ===============
00:02:17.399
00:02:17.399 common:
00:02:17.399
00:02:17.399 bus:
00:02:17.399 pci, vdev,
00:02:17.399 mempool:
00:02:17.399 ring,
00:02:17.399 dma:
00:02:17.399
00:02:17.399 net:
00:02:17.399
00:02:17.399 crypto:
00:02:17.399
00:02:17.399 compress:
00:02:17.399
00:02:17.399 vdpa:
00:02:17.399
00:02:17.399
00:02:17.399 Message:
00:02:17.399 =================
00:02:17.399 Content Skipped
00:02:17.399 =================
00:02:17.399
00:02:17.399 apps:
00:02:17.399 dumpcap: explicitly disabled via build config
00:02:17.399 graph: explicitly disabled via build config
00:02:17.399 pdump: explicitly disabled via build config
00:02:17.399 proc-info: explicitly disabled via build config
00:02:17.399 test-acl: explicitly disabled via build config
00:02:17.399 test-bbdev: explicitly disabled via build config
00:02:17.399 test-cmdline: explicitly disabled via build config
00:02:17.399 test-compress-perf: explicitly disabled via build config
00:02:17.399 test-crypto-perf: explicitly disabled via build config
00:02:17.399 test-dma-perf: explicitly disabled via build config
00:02:17.399 test-eventdev: explicitly disabled via build config
00:02:17.399 test-fib: explicitly disabled via build config
00:02:17.399 test-flow-perf: explicitly disabled via build config
00:02:17.399 test-gpudev: explicitly disabled via build config
00:02:17.399 test-mldev: explicitly disabled via build config
00:02:17.399 test-pipeline: explicitly disabled via build config
00:02:17.399 test-pmd: explicitly disabled via build config
00:02:17.399 test-regex: explicitly disabled via build config
00:02:17.399 test-sad: explicitly disabled via build config
00:02:17.399 test-security-perf: explicitly disabled via build config
00:02:17.399
00:02:17.399 libs:
00:02:17.399 metrics: explicitly disabled via build config
00:02:17.399 acl: explicitly disabled via build config
00:02:17.399 bbdev: explicitly disabled via build config
00:02:17.399 bitratestats: explicitly disabled via build config
00:02:17.399 bpf: explicitly disabled via build config
00:02:17.399 cfgfile: explicitly disabled via build config
00:02:17.399 distributor: explicitly disabled via build config
00:02:17.399 efd: explicitly disabled via build config
00:02:17.399 eventdev: explicitly disabled via build config
00:02:17.399 dispatcher: explicitly disabled via build config
00:02:17.399 gpudev: explicitly disabled via build config
00:02:17.399 gro: explicitly disabled via build config
00:02:17.399 gso: explicitly disabled via build config
00:02:17.399 ip_frag: explicitly disabled via build config
00:02:17.399 jobstats: explicitly disabled via build config
00:02:17.399 latencystats: explicitly disabled via build config
00:02:17.399 lpm: explicitly disabled via build config
00:02:17.399 member: explicitly disabled via build config
00:02:17.399 pcapng: explicitly disabled via build config
00:02:17.399 rawdev: explicitly disabled via build config
00:02:17.399 regexdev: explicitly disabled via build config
00:02:17.399 mldev: explicitly disabled via build config
00:02:17.399 rib: explicitly disabled via build config
00:02:17.399 sched: explicitly disabled via build config
00:02:17.399 stack: explicitly disabled via build config
00:02:17.399 ipsec: explicitly disabled via build config
00:02:17.399 pdcp: explicitly disabled via build config
00:02:17.399 fib: explicitly disabled via build config
00:02:17.399 port: explicitly disabled via build config
00:02:17.399 pdump: explicitly disabled via build config
00:02:17.399 table: explicitly disabled via build config
00:02:17.399 pipeline: explicitly disabled via build config
00:02:17.399 graph: explicitly disabled via build config
00:02:17.399 node: explicitly disabled via build config
00:02:17.399
00:02:17.399 drivers:
00:02:17.399 common/cpt: not in enabled drivers build config
00:02:17.399 common/dpaax: not in enabled drivers build config
00:02:17.399 common/iavf: not in enabled drivers build config
00:02:17.399 common/idpf: not in enabled drivers build config
00:02:17.399 common/mvep: not in enabled drivers build config
00:02:17.399 common/octeontx: not in enabled drivers build config
00:02:17.399 bus/auxiliary: not in enabled drivers build config
00:02:17.399 bus/cdx: not in enabled drivers build config
00:02:17.399 bus/dpaa: not in enabled drivers build config
00:02:17.399 bus/fslmc: not in enabled drivers build config
00:02:17.399 bus/ifpga: not in enabled drivers build config
00:02:17.399 bus/platform: not in enabled drivers build config
00:02:17.399 bus/vmbus: not in enabled drivers build config
00:02:17.399 common/cnxk: not in enabled drivers build config
00:02:17.399 common/mlx5: not in enabled drivers build config
00:02:17.399 common/nfp: not in enabled drivers build config
00:02:17.399 common/qat: not in enabled drivers build config
00:02:17.399 common/sfc_efx: not in enabled drivers build config
00:02:17.399 mempool/bucket: not in enabled drivers build config
00:02:17.399 mempool/cnxk: not in enabled drivers build config
00:02:17.399 mempool/dpaa: not in enabled drivers build config
00:02:17.399 mempool/dpaa2: not in enabled drivers build config
00:02:17.399 mempool/octeontx: not in enabled drivers build config
00:02:17.399 mempool/stack: not in enabled drivers build config
00:02:17.399 dma/cnxk: not in enabled drivers build config
00:02:17.399 dma/dpaa: not in enabled drivers build config
00:02:17.400 dma/dpaa2: not in enabled drivers build config
00:02:17.400 dma/hisilicon: not in enabled drivers build config
00:02:17.400 dma/idxd: not in enabled drivers build config
00:02:17.400 dma/ioat: not in enabled drivers build config
00:02:17.400 dma/skeleton: not in enabled drivers build config
00:02:17.400 net/af_packet: not in enabled drivers build config
00:02:17.400 net/af_xdp: not in enabled drivers build config
00:02:17.400 net/ark: not in enabled drivers build config
00:02:17.400 net/atlantic: not in enabled drivers build config
00:02:17.400 net/avp: not in enabled drivers build config
00:02:17.400 net/axgbe: not in enabled drivers build config
00:02:17.400 net/bnx2x: not in enabled drivers build config
00:02:17.400 net/bnxt: not in enabled drivers build config
00:02:17.400 net/bonding: not in enabled drivers build config
00:02:17.400 net/cnxk: not in enabled drivers build config
00:02:17.400 net/cpfl: not in enabled drivers build config
00:02:17.400 net/cxgbe: not in enabled drivers build config
00:02:17.400 net/dpaa: not in enabled drivers build config
00:02:17.400 net/dpaa2: not in enabled drivers build config
00:02:17.400 net/e1000: not in enabled drivers build config
00:02:17.400 net/ena: not in enabled drivers build config
00:02:17.400 net/enetc: not in enabled drivers build config
00:02:17.400 net/enetfec: not in enabled drivers build config
00:02:17.400 net/enic: not in enabled drivers build config
00:02:17.400 net/failsafe: not in enabled drivers build config
00:02:17.400 net/fm10k: not in enabled drivers build config
00:02:17.400 net/gve: not in enabled drivers build config
00:02:17.400 net/hinic: not in enabled drivers build config
00:02:17.400 net/hns3: not in enabled drivers build config
00:02:17.400 net/i40e: not in enabled drivers build config
00:02:17.400 net/iavf: not in enabled drivers build config
00:02:17.400 net/ice: not in enabled drivers build config
00:02:17.400 net/idpf: not in enabled drivers build config
00:02:17.400 net/igc: not in enabled drivers build config
00:02:17.400 net/ionic: not in enabled drivers build config
00:02:17.400 net/ipn3ke: not in enabled drivers build config
00:02:17.400 net/ixgbe: not in enabled drivers build config
00:02:17.400 net/mana: not in enabled drivers build config
00:02:17.400 net/memif: not in enabled drivers build config
00:02:17.400 net/mlx4: not in enabled drivers build config
00:02:17.400 net/mlx5: not in enabled drivers build config
00:02:17.400 net/mvneta: not in enabled drivers build config
00:02:17.400 net/mvpp2: not in enabled drivers build config
00:02:17.400 net/netvsc: not in enabled drivers build config
00:02:17.400 net/nfb: not in enabled drivers build config
00:02:17.400 net/nfp: not in enabled drivers build config
00:02:17.400 net/ngbe: not in enabled drivers build config
00:02:17.400 net/null: not in enabled drivers build config
00:02:17.400 net/octeontx: not in enabled drivers build config
00:02:17.400 net/octeon_ep: not in enabled drivers build config
00:02:17.400 net/pcap: not in enabled drivers build config
00:02:17.400 net/pfe: not in enabled drivers build config
00:02:17.400 net/qede: not in enabled drivers build config
00:02:17.400 net/ring: not in enabled drivers build config
00:02:17.400 net/sfc: not in enabled drivers build config
00:02:17.400 net/softnic: not in enabled drivers build config
00:02:17.400 net/tap: not in enabled drivers build config
00:02:17.400 net/thunderx: not in enabled drivers build config
00:02:17.400 net/txgbe: not in enabled drivers build config
00:02:17.400 net/vdev_netvsc: not in enabled drivers build config
00:02:17.400 net/vhost: not in enabled drivers build config
00:02:17.400 net/virtio: not in enabled drivers build config
00:02:17.400 net/vmxnet3: not in enabled drivers build config
00:02:17.400 raw/*: missing internal dependency, "rawdev"
00:02:17.400 crypto/armv8: not in enabled drivers build config
00:02:17.400 crypto/bcmfs: not in enabled drivers build config
00:02:17.400 crypto/caam_jr: not in enabled drivers build config
00:02:17.400 crypto/ccp: not in enabled drivers build config
00:02:17.400 crypto/cnxk: not in enabled drivers build config
00:02:17.400 crypto/dpaa_sec: not in enabled drivers build config
00:02:17.400 crypto/dpaa2_sec: not in enabled drivers build config
00:02:17.400 crypto/ipsec_mb: not in enabled drivers build config
00:02:17.400 crypto/mlx5: not in enabled drivers build config
00:02:17.400 crypto/mvsam: not in enabled drivers build config
00:02:17.400 crypto/nitrox: not in enabled drivers build config
00:02:17.400 crypto/null: not in enabled drivers build config
00:02:17.400 crypto/octeontx: not in enabled drivers build config
00:02:17.400 crypto/openssl: not in enabled drivers build config
00:02:17.400 crypto/scheduler: not in enabled drivers build config
00:02:17.400 crypto/uadk: not in enabled drivers build config
00:02:17.400 crypto/virtio: not in enabled drivers build config
00:02:17.400 compress/isal: not in enabled drivers build config
00:02:17.400 compress/mlx5: not in enabled drivers build config
00:02:17.400 compress/octeontx: not in enabled drivers build config
00:02:17.400 compress/zlib: not in enabled drivers build config
00:02:17.400 regex/*: missing internal dependency, "regexdev"
00:02:17.400 ml/*: missing internal dependency, "mldev"
00:02:17.400 vdpa/ifc: not in enabled drivers build config
00:02:17.400 vdpa/mlx5: not in enabled drivers build config
00:02:17.400 vdpa/nfp: not in enabled drivers build config
00:02:17.400 vdpa/sfc: not in enabled drivers build config
00:02:17.400 event/*: missing internal dependency, "eventdev"
00:02:17.400 baseband/*: missing internal dependency, "bbdev"
00:02:17.400 gpu/*: missing internal dependency, "gpudev"
00:02:17.400
00:02:17.400
00:02:17.400 Build targets in project: 85
00:02:17.400
00:02:17.400 DPDK 23.11.0
00:02:17.400
00:02:17.400 User defined options
00:02:17.400 buildtype : debug
00:02:17.400 default_library : static
00:02:17.400 libdir : lib
00:02:17.400 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:02:17.400 c_args : -fPIC -Werror
00:02:17.400 c_link_args :
00:02:17.400 cpu_instruction_set: native
00:02:17.400 disable_apps : test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump
00:02:17.400 disable_libs : bbdev,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump
00:02:17.400 enable_docs : false
00:02:17.400 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:02:17.400 enable_kmods : false
00:02:17.400 tests : false
00:02:17.400
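The "User defined options" block above is the whole recipe for how SPDK trims DPDK down: a debug, static, -fPIC -Werror build with docs, tests and kmods off and most apps/libs disabled, keeping only the bus/pci, bus/vdev and mempool/ring drivers. A sketch of the equivalent manual invocation (the disable lists are abbreviated here for illustration; the full comma-separated lists are exactly the ones printed above):

  meson setup build-tmp dpdk \
      -Dbuildtype=debug -Ddefault_library=static \
      -Dc_args='-fPIC -Werror' -Dcpu_instruction_set=native \
      -Denable_docs=false -Denable_kmods=false -Dtests=false \
      -Ddisable_apps=dumpcap,graph,pdump,proc-info \
      -Ddisable_libs=bbdev,gpudev,mldev,pipeline \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring
  ninja -C build-tmp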
00:02:17.400 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:17.662 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp'
00:02:17.662 [1/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:17.662 [2/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:17.662 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:17.931 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:17.931 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:17.931 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:17.931 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:17.931 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:17.931 [9/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:17.931 [10/265] Compiling C object lib/librte_log.a.p/log_log.c.o
00:02:17.931 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:17.931 [12/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:17.931 [13/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:17.931 [14/265] Linking static target lib/librte_kvargs.a
00:02:17.931 [15/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:17.931 [16/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:17.931 [17/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:17.931 [18/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:17.931 [19/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:17.931 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:17.931 [21/265] Linking static target lib/librte_log.a
00:02:17.931 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:17.931 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:17.931 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:17.931 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:17.931 [26/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:17.931 [27/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:17.931 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:17.931 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:17.931 [30/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:17.931 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:17.931 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:17.931 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:17.931 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:17.931 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:17.931 [36/265] Linking static target lib/librte_pci.a
00:02:17.931 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:17.931 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:02:17.931 [39/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:02:17.931 [40/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:02:18.194 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:02:18.194 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:18.194 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:18.194 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:18.194 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:18.194 [46/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:18.194 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:18.194 [48/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:18.194 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:18.194 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:18.194 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:18.194 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:18.194 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:18.194 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:18.194 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:18.194 [56/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:18.194 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:18.194 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:18.194 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:18.194 [60/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:18.194 [61/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:18.194 [62/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:18.454 [63/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:18.454 [64/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:18.454 [65/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:18.454 [66/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:18.454 [67/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:18.454 [68/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:18.454 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:02:18.454 [70/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:18.454 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:18.454 [72/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:18.454 [73/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:18.454 [74/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:18.454 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:18.454 [76/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:18.454 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:18.454 [78/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:18.454 [79/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:18.454 [80/265] Linking static target lib/librte_telemetry.a
00:02:18.454 [81/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:18.454 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:18.454 [83/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:18.454 [84/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:18.454 [85/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:18.454 [86/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:18.454 [87/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:18.454 [88/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:02:18.454 [89/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:18.454 [90/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:18.454 [91/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:18.454 [92/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:18.454 [93/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:02:18.454 [94/265] Linking static target lib/librte_ring.a
00:02:18.454 [95/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:18.454 [96/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:18.454 [97/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:18.454 [98/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:18.454 [99/265] Linking static target lib/librte_meter.a
00:02:18.454 [100/265] Linking static target lib/net/libnet_crc_avx512_lib.a
00:02:18.454 [101/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:18.454 [102/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:18.454 [103/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:18.454 [104/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:02:18.454 [105/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:18.454 [106/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:18.454 [107/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:18.454 [108/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:18.454 [109/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:02:18.454 [110/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:02:18.454 [111/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:02:18.454 [112/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:02:18.454 [113/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:18.454 [114/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:18.454 [115/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:02:18.455 [116/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:02:18.455 [117/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:18.455 [118/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:02:18.455 [119/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:02:18.455 [120/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:18.455 [121/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:18.455 [122/265] Linking static target lib/librte_timer.a
00:02:18.455 [123/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:18.455 [124/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:02:18.455 [125/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:02:18.455 [126/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:02:18.455 [127/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:02:18.455 [128/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:02:18.455 [129/265] Linking static target lib/librte_rcu.a
00:02:18.455 [130/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:18.455 [131/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:02:18.455 [132/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:02:18.455 [133/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:02:18.455 [134/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:18.455 [135/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:18.455 [136/265] Linking static target lib/librte_cmdline.a
00:02:18.455 [137/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:02:18.455 [138/265] Linking static target lib/librte_eal.a
00:02:18.455 [139/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:02:18.455 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:18.455 [141/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:18.455 [142/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:02:18.455 [143/265] Linking target lib/librte_log.so.24.0
00:02:18.455 [144/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:02:18.455 [145/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:02:18.455 [146/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:02:18.455 [147/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:02:18.455 [148/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:02:18.455 [149/265] Linking static target lib/librte_dmadev.a
00:02:18.455 [150/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:02:18.455 [151/265] Linking static target lib/librte_mbuf.a
00:02:18.455 [152/265] Linking static target lib/librte_mempool.a
00:02:18.455 [153/265] Linking static target lib/librte_power.a
00:02:18.455 [154/265] Linking static target lib/librte_net.a
00:02:18.455 [155/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:18.455 [156/265] Linking static target lib/librte_compressdev.a
00:02:18.455 [157/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o
00:02:18.455 [158/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:02:18.455 [159/265] Linking static target lib/librte_hash.a
00:02:18.713 [160/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o
00:02:18.713 [161/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:02:18.713
[162/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:18.713 [163/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:18.713 [164/265] Linking static target lib/librte_security.a 00:02:18.713 [165/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:18.713 [166/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:18.713 [167/265] Linking static target lib/librte_reorder.a 00:02:18.713 [168/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:18.713 [169/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.714 [170/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:18.714 [171/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:18.714 [172/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:18.714 [173/265] Linking target lib/librte_kvargs.so.24.0 00:02:18.714 [174/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:18.714 [175/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:18.714 [176/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:18.714 [177/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:18.714 [178/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.714 [179/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:18.714 [180/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:18.714 [181/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:18.714 [182/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:18.714 [183/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:18.714 [184/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:18.714 [185/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.714 [186/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:18.714 [187/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:18.714 [188/265] Linking static target lib/librte_cryptodev.a 00:02:18.972 [189/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:18.972 [190/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:18.972 [191/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:18.972 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:18.972 [193/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.972 [194/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.972 [195/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:18.972 [196/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.972 [197/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:18.972 [198/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:18.972 [199/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:18.972 [200/265] Linking static target drivers/librte_bus_vdev.a 00:02:18.972 [201/265] 
Linking target lib/librte_telemetry.so.24.0 00:02:18.972 [202/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:18.972 [203/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:18.972 [204/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:18.972 [205/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.972 [206/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.972 [207/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.972 [208/265] Linking static target lib/librte_ethdev.a 00:02:18.972 [209/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.972 [210/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.972 [211/265] Linking static target drivers/librte_bus_pci.a 00:02:19.230 [212/265] Linking static target drivers/librte_mempool_ring.a 00:02:19.230 [213/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:19.230 [214/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.230 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:19.230 [216/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.230 [217/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.488 [218/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.488 [219/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.488 [220/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.488 [221/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.488 [222/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.746 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:19.746 [224/265] Linking static target lib/librte_vhost.a 00:02:19.746 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:20.005 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.384 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.950 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.514 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.043 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.043 [231/265] Linking target lib/librte_eal.so.24.0 00:02:31.301 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:31.301 [233/265] Linking target lib/librte_meter.so.24.0 00:02:31.301 [234/265] Linking target lib/librte_timer.so.24.0 00:02:31.301 [235/265] Linking target lib/librte_pci.so.24.0 00:02:31.301 [236/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:31.301 [237/265] Linking target lib/librte_ring.so.24.0 00:02:31.301 [238/265] Linking target 
lib/librte_dmadev.so.24.0 00:02:31.301 [239/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:31.559 [240/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:31.559 [241/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:31.559 [242/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:31.559 [243/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:31.559 [244/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:31.559 [245/265] Linking target lib/librte_rcu.so.24.0 00:02:31.559 [246/265] Linking target lib/librte_mempool.so.24.0 00:02:31.559 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:31.559 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:31.817 [249/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:31.817 [250/265] Linking target lib/librte_mbuf.so.24.0 00:02:31.817 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:31.817 [252/265] Linking target lib/librte_compressdev.so.24.0 00:02:31.817 [253/265] Linking target lib/librte_reorder.so.24.0 00:02:31.817 [254/265] Linking target lib/librte_net.so.24.0 00:02:31.817 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:02:32.077 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:32.077 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:32.077 [258/265] Linking target lib/librte_hash.so.24.0 00:02:32.077 [259/265] Linking target lib/librte_cmdline.so.24.0 00:02:32.077 [260/265] Linking target lib/librte_ethdev.so.24.0 00:02:32.077 [261/265] Linking target lib/librte_security.so.24.0 00:02:32.077 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:32.336 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:32.336 [264/265] Linking target lib/librte_vhost.so.24.0 00:02:32.336 [265/265] Linking target lib/librte_power.so.24.0 00:02:32.336 INFO: autodetecting backend as ninja 00:02:32.336 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:33.271 CC lib/ut_mock/mock.o 00:02:33.271 CC lib/log/log.o 00:02:33.271 CC lib/log/log_flags.o 00:02:33.271 CC lib/log/log_deprecated.o 00:02:33.271 CC lib/ut/ut.o 00:02:33.271 LIB libspdk_ut_mock.a 00:02:33.271 LIB libspdk_log.a 00:02:33.271 LIB libspdk_ut.a 00:02:33.530 CC lib/util/base64.o 00:02:33.530 CC lib/util/bit_array.o 00:02:33.530 CC lib/util/crc16.o 00:02:33.530 CC lib/util/cpuset.o 00:02:33.530 CC lib/util/crc32.o 00:02:33.530 CC lib/util/crc32c.o 00:02:33.530 CC lib/util/crc32_ieee.o 00:02:33.530 CC lib/util/fd.o 00:02:33.530 CC lib/util/crc64.o 00:02:33.530 CC lib/util/file.o 00:02:33.530 CC lib/util/dif.o 00:02:33.530 CC lib/util/hexlify.o 00:02:33.530 CC lib/util/iov.o 00:02:33.530 CC lib/util/math.o 00:02:33.530 CC lib/util/pipe.o 00:02:33.530 CC lib/util/strerror_tls.o 00:02:33.530 CC lib/dma/dma.o 00:02:33.530 CC lib/util/string.o 00:02:33.530 CC lib/util/uuid.o 00:02:33.530 CC lib/util/fd_group.o 00:02:33.530 CC lib/util/xor.o 00:02:33.530 CC lib/util/zipf.o 00:02:33.530 CXX lib/trace_parser/trace.o 00:02:33.530 CC lib/ioat/ioat.o 00:02:33.789 CC lib/vfio_user/host/vfio_user_pci.o 
00:02:33.789 CC lib/vfio_user/host/vfio_user.o 00:02:33.789 LIB libspdk_dma.a 00:02:33.789 LIB libspdk_ioat.a 00:02:33.789 LIB libspdk_vfio_user.a 00:02:34.048 LIB libspdk_util.a 00:02:34.048 LIB libspdk_trace_parser.a 00:02:34.306 CC lib/rdma/common.o 00:02:34.306 CC lib/rdma/rdma_verbs.o 00:02:34.306 CC lib/json/json_parse.o 00:02:34.306 CC lib/json/json_util.o 00:02:34.306 CC lib/json/json_write.o 00:02:34.306 CC lib/idxd/idxd.o 00:02:34.306 CC lib/idxd/idxd_user.o 00:02:34.306 CC lib/idxd/idxd_kernel.o 00:02:34.306 CC lib/conf/conf.o 00:02:34.306 CC lib/env_dpdk/env.o 00:02:34.306 CC lib/env_dpdk/memory.o 00:02:34.306 CC lib/env_dpdk/pci.o 00:02:34.306 CC lib/env_dpdk/threads.o 00:02:34.306 CC lib/env_dpdk/init.o 00:02:34.306 CC lib/env_dpdk/pci_ioat.o 00:02:34.306 CC lib/env_dpdk/pci_virtio.o 00:02:34.306 CC lib/env_dpdk/pci_vmd.o 00:02:34.306 CC lib/env_dpdk/pci_idxd.o 00:02:34.306 CC lib/env_dpdk/pci_event.o 00:02:34.306 CC lib/env_dpdk/sigbus_handler.o 00:02:34.306 CC lib/env_dpdk/pci_dpdk.o 00:02:34.306 CC lib/vmd/vmd.o 00:02:34.306 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:34.306 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:34.306 CC lib/vmd/led.o 00:02:34.306 LIB libspdk_conf.a 00:02:34.565 LIB libspdk_rdma.a 00:02:34.565 LIB libspdk_json.a 00:02:34.565 LIB libspdk_idxd.a 00:02:34.565 LIB libspdk_vmd.a 00:02:34.825 CC lib/jsonrpc/jsonrpc_server.o 00:02:34.825 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:34.825 CC lib/jsonrpc/jsonrpc_client.o 00:02:34.825 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:34.825 LIB libspdk_jsonrpc.a 00:02:35.084 LIB libspdk_env_dpdk.a 00:02:35.344 CC lib/rpc/rpc.o 00:02:35.344 LIB libspdk_rpc.a 00:02:35.602 CC lib/sock/sock.o 00:02:35.602 CC lib/sock/sock_rpc.o 00:02:35.602 CC lib/trace/trace.o 00:02:35.602 CC lib/notify/notify.o 00:02:35.602 CC lib/trace/trace_flags.o 00:02:35.602 CC lib/notify/notify_rpc.o 00:02:35.602 CC lib/trace/trace_rpc.o 00:02:35.862 LIB libspdk_notify.a 00:02:35.862 LIB libspdk_trace.a 00:02:35.862 LIB libspdk_sock.a 00:02:36.120 CC lib/thread/thread.o 00:02:36.120 CC lib/thread/iobuf.o 00:02:36.120 CC lib/nvme/nvme_ctrlr.o 00:02:36.120 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:36.120 CC lib/nvme/nvme_fabric.o 00:02:36.120 CC lib/nvme/nvme_ns.o 00:02:36.120 CC lib/nvme/nvme_ns_cmd.o 00:02:36.120 CC lib/nvme/nvme_pcie.o 00:02:36.120 CC lib/nvme/nvme_qpair.o 00:02:36.120 CC lib/nvme/nvme_pcie_common.o 00:02:36.120 CC lib/nvme/nvme_quirks.o 00:02:36.120 CC lib/nvme/nvme_transport.o 00:02:36.120 CC lib/nvme/nvme.o 00:02:36.120 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:36.120 CC lib/nvme/nvme_discovery.o 00:02:36.120 CC lib/nvme/nvme_tcp.o 00:02:36.120 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:36.120 CC lib/nvme/nvme_opal.o 00:02:36.120 CC lib/nvme/nvme_io_msg.o 00:02:36.120 CC lib/nvme/nvme_poll_group.o 00:02:36.120 CC lib/nvme/nvme_cuse.o 00:02:36.120 CC lib/nvme/nvme_zns.o 00:02:36.120 CC lib/nvme/nvme_vfio_user.o 00:02:36.120 CC lib/nvme/nvme_rdma.o 00:02:37.055 LIB libspdk_thread.a 00:02:37.055 CC lib/blob/blobstore.o 00:02:37.055 CC lib/virtio/virtio.o 00:02:37.055 CC lib/init/json_config.o 00:02:37.055 CC lib/blob/zeroes.o 00:02:37.055 CC lib/virtio/virtio_vhost_user.o 00:02:37.055 CC lib/virtio/virtio_pci.o 00:02:37.055 CC lib/blob/request.o 00:02:37.055 CC lib/init/subsystem_rpc.o 00:02:37.055 CC lib/virtio/virtio_vfio_user.o 00:02:37.055 CC lib/init/subsystem.o 00:02:37.055 CC lib/blob/blob_bs_dev.o 00:02:37.313 CC lib/init/rpc.o 00:02:37.313 CC lib/vfu_tgt/tgt_endpoint.o 00:02:37.313 CC lib/accel/accel_rpc.o 00:02:37.313 CC lib/accel/accel.o 
00:02:37.313 CC lib/vfu_tgt/tgt_rpc.o 00:02:37.313 CC lib/accel/accel_sw.o 00:02:37.313 LIB libspdk_init.a 00:02:37.313 LIB libspdk_virtio.a 00:02:37.313 LIB libspdk_nvme.a 00:02:37.313 LIB libspdk_vfu_tgt.a 00:02:37.572 CC lib/event/log_rpc.o 00:02:37.572 CC lib/event/app.o 00:02:37.572 CC lib/event/reactor.o 00:02:37.572 CC lib/event/app_rpc.o 00:02:37.572 CC lib/event/scheduler_static.o 00:02:37.831 LIB libspdk_accel.a 00:02:37.831 LIB libspdk_event.a 00:02:38.091 CC lib/bdev/bdev.o 00:02:38.091 CC lib/bdev/bdev_rpc.o 00:02:38.091 CC lib/bdev/bdev_zone.o 00:02:38.091 CC lib/bdev/part.o 00:02:38.091 CC lib/bdev/scsi_nvme.o 00:02:38.660 LIB libspdk_blob.a 00:02:38.919 CC lib/blobfs/blobfs.o 00:02:39.178 CC lib/blobfs/tree.o 00:02:39.178 CC lib/lvol/lvol.o 00:02:39.437 LIB libspdk_lvol.a 00:02:39.437 LIB libspdk_blobfs.a 00:02:40.006 LIB libspdk_bdev.a 00:02:40.264 CC lib/ublk/ublk.o 00:02:40.264 CC lib/ublk/ublk_rpc.o 00:02:40.264 CC lib/nvmf/ctrlr.o 00:02:40.264 CC lib/nvmf/ctrlr_discovery.o 00:02:40.264 CC lib/nvmf/ctrlr_bdev.o 00:02:40.264 CC lib/nvmf/subsystem.o 00:02:40.264 CC lib/nvmf/nvmf_rpc.o 00:02:40.264 CC lib/nbd/nbd.o 00:02:40.264 CC lib/nvmf/nvmf.o 00:02:40.264 CC lib/nbd/nbd_rpc.o 00:02:40.264 CC lib/nvmf/transport.o 00:02:40.264 CC lib/nvmf/tcp.o 00:02:40.264 CC lib/nvmf/vfio_user.o 00:02:40.264 CC lib/nvmf/rdma.o 00:02:40.264 CC lib/scsi/dev.o 00:02:40.264 CC lib/scsi/lun.o 00:02:40.264 CC lib/scsi/port.o 00:02:40.264 CC lib/scsi/scsi.o 00:02:40.264 CC lib/scsi/scsi_bdev.o 00:02:40.264 CC lib/ftl/ftl_core.o 00:02:40.264 CC lib/scsi/scsi_pr.o 00:02:40.264 CC lib/ftl/ftl_init.o 00:02:40.264 CC lib/scsi/scsi_rpc.o 00:02:40.264 CC lib/ftl/ftl_layout.o 00:02:40.264 CC lib/scsi/task.o 00:02:40.264 CC lib/ftl/ftl_debug.o 00:02:40.264 CC lib/ftl/ftl_io.o 00:02:40.264 CC lib/ftl/ftl_sb.o 00:02:40.264 CC lib/ftl/ftl_l2p.o 00:02:40.264 CC lib/ftl/ftl_l2p_flat.o 00:02:40.264 CC lib/ftl/ftl_nv_cache.o 00:02:40.264 CC lib/ftl/ftl_band.o 00:02:40.264 CC lib/ftl/ftl_band_ops.o 00:02:40.264 CC lib/ftl/ftl_writer.o 00:02:40.264 CC lib/ftl/ftl_rq.o 00:02:40.264 CC lib/ftl/ftl_reloc.o 00:02:40.264 CC lib/ftl/ftl_l2p_cache.o 00:02:40.264 CC lib/ftl/ftl_p2l.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:40.264 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:40.264 CC lib/ftl/utils/ftl_conf.o 00:02:40.264 CC lib/ftl/utils/ftl_md.o 00:02:40.264 CC lib/ftl/utils/ftl_mempool.o 00:02:40.264 CC lib/ftl/utils/ftl_bitmap.o 00:02:40.264 CC lib/ftl/utils/ftl_property.o 00:02:40.264 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:40.264 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:40.264 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:40.265 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:40.265 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:40.265 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:40.265 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:40.265 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:40.265 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:40.265 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:40.265 CC lib/ftl/base/ftl_base_dev.o 00:02:40.265 CC 
lib/ftl/base/ftl_base_bdev.o 00:02:40.265 CC lib/ftl/ftl_trace.o 00:02:40.525 LIB libspdk_scsi.a 00:02:40.525 LIB libspdk_nbd.a 00:02:40.525 LIB libspdk_ublk.a 00:02:40.829 CC lib/vhost/vhost.o 00:02:40.829 CC lib/vhost/vhost_rpc.o 00:02:40.829 CC lib/vhost/vhost_scsi.o 00:02:40.829 CC lib/vhost/vhost_blk.o 00:02:40.829 CC lib/vhost/rte_vhost_user.o 00:02:40.829 CC lib/iscsi/conn.o 00:02:40.829 CC lib/iscsi/md5.o 00:02:40.829 CC lib/iscsi/init_grp.o 00:02:40.829 CC lib/iscsi/portal_grp.o 00:02:40.829 CC lib/iscsi/iscsi.o 00:02:40.829 CC lib/iscsi/param.o 00:02:40.829 CC lib/iscsi/tgt_node.o 00:02:40.829 CC lib/iscsi/iscsi_subsystem.o 00:02:40.829 CC lib/iscsi/iscsi_rpc.o 00:02:40.829 CC lib/iscsi/task.o 00:02:40.829 LIB libspdk_ftl.a 00:02:41.404 LIB libspdk_nvmf.a 00:02:41.404 LIB libspdk_vhost.a 00:02:41.661 LIB libspdk_iscsi.a 00:02:42.227 CC module/env_dpdk/env_dpdk_rpc.o 00:02:42.227 CC module/vfu_device/vfu_virtio.o 00:02:42.227 CC module/vfu_device/vfu_virtio_blk.o 00:02:42.227 CC module/vfu_device/vfu_virtio_scsi.o 00:02:42.227 CC module/vfu_device/vfu_virtio_rpc.o 00:02:42.227 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:42.227 CC module/blob/bdev/blob_bdev.o 00:02:42.227 LIB libspdk_env_dpdk_rpc.a 00:02:42.227 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:42.227 CC module/sock/posix/posix.o 00:02:42.227 CC module/scheduler/gscheduler/gscheduler.o 00:02:42.227 CC module/accel/iaa/accel_iaa.o 00:02:42.227 CC module/accel/iaa/accel_iaa_rpc.o 00:02:42.227 CC module/accel/ioat/accel_ioat_rpc.o 00:02:42.227 CC module/accel/ioat/accel_ioat.o 00:02:42.227 CC module/accel/error/accel_error_rpc.o 00:02:42.227 CC module/accel/error/accel_error.o 00:02:42.227 CC module/accel/dsa/accel_dsa.o 00:02:42.227 CC module/accel/dsa/accel_dsa_rpc.o 00:02:42.484 LIB libspdk_scheduler_dpdk_governor.a 00:02:42.484 LIB libspdk_scheduler_gscheduler.a 00:02:42.484 LIB libspdk_scheduler_dynamic.a 00:02:42.484 LIB libspdk_accel_error.a 00:02:42.484 LIB libspdk_accel_ioat.a 00:02:42.484 LIB libspdk_blob_bdev.a 00:02:42.484 LIB libspdk_accel_iaa.a 00:02:42.484 LIB libspdk_accel_dsa.a 00:02:42.484 LIB libspdk_vfu_device.a 00:02:42.741 LIB libspdk_sock_posix.a 00:02:42.741 CC module/bdev/nvme/bdev_nvme.o 00:02:42.741 CC module/bdev/split/vbdev_split.o 00:02:42.741 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:42.741 CC module/bdev/nvme/nvme_rpc.o 00:02:42.741 CC module/bdev/split/vbdev_split_rpc.o 00:02:42.741 CC module/bdev/nvme/bdev_mdns_client.o 00:02:42.741 CC module/bdev/nvme/vbdev_opal.o 00:02:42.741 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:42.741 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:42.741 CC module/bdev/raid/bdev_raid.o 00:02:42.741 CC module/bdev/gpt/gpt.o 00:02:42.741 CC module/bdev/raid/bdev_raid_rpc.o 00:02:42.741 CC module/bdev/gpt/vbdev_gpt.o 00:02:42.741 CC module/bdev/raid/raid0.o 00:02:42.741 CC module/bdev/raid/raid1.o 00:02:42.741 CC module/bdev/raid/bdev_raid_sb.o 00:02:42.741 CC module/bdev/raid/concat.o 00:02:42.741 CC module/bdev/delay/vbdev_delay.o 00:02:42.742 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:42.742 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:42.742 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:42.742 CC module/bdev/iscsi/bdev_iscsi.o 00:02:42.742 CC module/bdev/lvol/vbdev_lvol.o 00:02:42.999 CC module/bdev/malloc/bdev_malloc.o 00:02:42.999 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:42.999 CC module/bdev/null/bdev_null.o 00:02:42.999 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:42.999 CC module/bdev/ftl/bdev_ftl.o 00:02:42.999 CC 
module/bdev/aio/bdev_aio.o 00:02:42.999 CC module/bdev/passthru/vbdev_passthru.o 00:02:42.999 CC module/bdev/aio/bdev_aio_rpc.o 00:02:42.999 CC module/bdev/null/bdev_null_rpc.o 00:02:42.999 CC module/blobfs/bdev/blobfs_bdev.o 00:02:42.999 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:42.999 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:42.999 CC module/bdev/error/vbdev_error.o 00:02:42.999 CC module/bdev/error/vbdev_error_rpc.o 00:02:42.999 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:42.999 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:42.999 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:42.999 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:42.999 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:42.999 LIB libspdk_blobfs_bdev.a 00:02:42.999 LIB libspdk_bdev_split.a 00:02:42.999 LIB libspdk_bdev_gpt.a 00:02:42.999 LIB libspdk_bdev_null.a 00:02:42.999 LIB libspdk_bdev_ftl.a 00:02:42.999 LIB libspdk_bdev_error.a 00:02:42.999 LIB libspdk_bdev_passthru.a 00:02:42.999 LIB libspdk_bdev_iscsi.a 00:02:42.999 LIB libspdk_bdev_aio.a 00:02:43.257 LIB libspdk_bdev_delay.a 00:02:43.257 LIB libspdk_bdev_zone_block.a 00:02:43.257 LIB libspdk_bdev_malloc.a 00:02:43.257 LIB libspdk_bdev_lvol.a 00:02:43.257 LIB libspdk_bdev_virtio.a 00:02:43.257 LIB libspdk_bdev_raid.a 00:02:44.190 LIB libspdk_bdev_nvme.a 00:02:44.754 CC module/event/subsystems/scheduler/scheduler.o 00:02:44.754 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:44.754 CC module/event/subsystems/iobuf/iobuf.o 00:02:44.754 CC module/event/subsystems/vmd/vmd.o 00:02:44.754 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:44.754 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:44.754 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:44.754 CC module/event/subsystems/sock/sock.o 00:02:44.754 LIB libspdk_event_scheduler.a 00:02:44.754 LIB libspdk_event_vmd.a 00:02:44.754 LIB libspdk_event_iobuf.a 00:02:44.754 LIB libspdk_event_vfu_tgt.a 00:02:44.754 LIB libspdk_event_vhost_blk.a 00:02:44.754 LIB libspdk_event_sock.a 00:02:45.013 CC module/event/subsystems/accel/accel.o 00:02:45.270 LIB libspdk_event_accel.a 00:02:45.528 CC module/event/subsystems/bdev/bdev.o 00:02:45.786 LIB libspdk_event_bdev.a 00:02:46.044 CC module/event/subsystems/ublk/ublk.o 00:02:46.044 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:46.044 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:46.044 CC module/event/subsystems/nbd/nbd.o 00:02:46.044 CC module/event/subsystems/scsi/scsi.o 00:02:46.044 LIB libspdk_event_ublk.a 00:02:46.044 LIB libspdk_event_nbd.a 00:02:46.044 LIB libspdk_event_scsi.a 00:02:46.302 LIB libspdk_event_nvmf.a 00:02:46.560 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:46.560 CC module/event/subsystems/iscsi/iscsi.o 00:02:46.560 LIB libspdk_event_vhost_scsi.a 00:02:46.560 LIB libspdk_event_iscsi.a 00:02:46.818 CC app/spdk_top/spdk_top.o 00:02:46.818 CC app/trace_record/trace_record.o 00:02:46.818 CXX app/trace/trace.o 00:02:46.818 CC app/spdk_lspci/spdk_lspci.o 00:02:46.818 CC app/spdk_nvme_perf/perf.o 00:02:46.818 CC app/spdk_nvme_identify/identify.o 00:02:46.818 CC app/spdk_nvme_discover/discovery_aer.o 00:02:46.818 TEST_HEADER include/spdk/accel.h 00:02:46.818 TEST_HEADER include/spdk/accel_module.h 00:02:46.818 TEST_HEADER include/spdk/assert.h 00:02:46.818 TEST_HEADER include/spdk/base64.h 00:02:46.818 TEST_HEADER include/spdk/barrier.h 00:02:46.818 TEST_HEADER include/spdk/bdev.h 00:02:46.818 CC app/iscsi_tgt/iscsi_tgt.o 00:02:46.818 CC app/spdk_dd/spdk_dd.o 00:02:46.818 TEST_HEADER include/spdk/bdev_module.h 
00:02:46.818 TEST_HEADER include/spdk/bit_array.h 00:02:46.818 TEST_HEADER include/spdk/bit_pool.h 00:02:46.818 TEST_HEADER include/spdk/bdev_zone.h 00:02:46.819 TEST_HEADER include/spdk/blob_bdev.h 00:02:46.819 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:46.819 TEST_HEADER include/spdk/blobfs.h 00:02:46.819 TEST_HEADER include/spdk/conf.h 00:02:46.819 TEST_HEADER include/spdk/blob.h 00:02:46.819 TEST_HEADER include/spdk/config.h 00:02:46.819 TEST_HEADER include/spdk/cpuset.h 00:02:46.819 TEST_HEADER include/spdk/crc16.h 00:02:46.819 CC app/vhost/vhost.o 00:02:46.819 TEST_HEADER include/spdk/crc32.h 00:02:46.819 CC test/rpc_client/rpc_client_test.o 00:02:46.819 TEST_HEADER include/spdk/crc64.h 00:02:46.819 TEST_HEADER include/spdk/dif.h 00:02:46.819 TEST_HEADER include/spdk/dma.h 00:02:46.819 TEST_HEADER include/spdk/endian.h 00:02:46.819 TEST_HEADER include/spdk/env.h 00:02:46.819 TEST_HEADER include/spdk/env_dpdk.h 00:02:46.819 TEST_HEADER include/spdk/event.h 00:02:46.819 CC app/nvmf_tgt/nvmf_main.o 00:02:46.819 TEST_HEADER include/spdk/fd_group.h 00:02:46.819 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:46.819 TEST_HEADER include/spdk/fd.h 00:02:46.819 TEST_HEADER include/spdk/file.h 00:02:46.819 TEST_HEADER include/spdk/ftl.h 00:02:46.819 TEST_HEADER include/spdk/hexlify.h 00:02:46.819 TEST_HEADER include/spdk/gpt_spec.h 00:02:46.819 TEST_HEADER include/spdk/histogram_data.h 00:02:46.819 TEST_HEADER include/spdk/idxd.h 00:02:46.819 TEST_HEADER include/spdk/idxd_spec.h 00:02:46.819 TEST_HEADER include/spdk/ioat.h 00:02:46.819 TEST_HEADER include/spdk/init.h 00:02:46.819 TEST_HEADER include/spdk/ioat_spec.h 00:02:46.819 TEST_HEADER include/spdk/iscsi_spec.h 00:02:46.819 TEST_HEADER include/spdk/json.h 00:02:46.819 TEST_HEADER include/spdk/jsonrpc.h 00:02:46.819 TEST_HEADER include/spdk/likely.h 00:02:46.819 TEST_HEADER include/spdk/log.h 00:02:46.819 TEST_HEADER include/spdk/lvol.h 00:02:46.819 TEST_HEADER include/spdk/memory.h 00:02:46.819 TEST_HEADER include/spdk/mmio.h 00:02:46.819 TEST_HEADER include/spdk/nbd.h 00:02:46.819 TEST_HEADER include/spdk/notify.h 00:02:46.819 TEST_HEADER include/spdk/nvme.h 00:02:46.819 TEST_HEADER include/spdk/nvme_intel.h 00:02:46.819 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:46.819 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:46.819 CC app/spdk_tgt/spdk_tgt.o 00:02:46.819 TEST_HEADER include/spdk/nvme_zns.h 00:02:46.819 TEST_HEADER include/spdk/nvme_spec.h 00:02:46.819 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:46.819 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:46.819 TEST_HEADER include/spdk/nvmf.h 00:02:46.819 TEST_HEADER include/spdk/nvmf_spec.h 00:02:46.819 TEST_HEADER include/spdk/opal.h 00:02:46.819 TEST_HEADER include/spdk/nvmf_transport.h 00:02:46.819 TEST_HEADER include/spdk/opal_spec.h 00:02:46.819 TEST_HEADER include/spdk/pci_ids.h 00:02:47.084 TEST_HEADER include/spdk/pipe.h 00:02:47.084 TEST_HEADER include/spdk/queue.h 00:02:47.084 TEST_HEADER include/spdk/reduce.h 00:02:47.084 TEST_HEADER include/spdk/scheduler.h 00:02:47.084 TEST_HEADER include/spdk/rpc.h 00:02:47.084 TEST_HEADER include/spdk/scsi.h 00:02:47.084 TEST_HEADER include/spdk/scsi_spec.h 00:02:47.084 TEST_HEADER include/spdk/sock.h 00:02:47.084 TEST_HEADER include/spdk/stdinc.h 00:02:47.084 TEST_HEADER include/spdk/string.h 00:02:47.084 TEST_HEADER include/spdk/thread.h 00:02:47.084 TEST_HEADER include/spdk/trace.h 00:02:47.084 TEST_HEADER include/spdk/tree.h 00:02:47.084 TEST_HEADER include/spdk/trace_parser.h 00:02:47.084 TEST_HEADER include/spdk/ublk.h 
00:02:47.084 TEST_HEADER include/spdk/util.h 00:02:47.084 TEST_HEADER include/spdk/uuid.h 00:02:47.084 TEST_HEADER include/spdk/version.h 00:02:47.084 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:47.084 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:47.084 TEST_HEADER include/spdk/vhost.h 00:02:47.084 TEST_HEADER include/spdk/vmd.h 00:02:47.084 TEST_HEADER include/spdk/xor.h 00:02:47.084 TEST_HEADER include/spdk/zipf.h 00:02:47.084 CXX test/cpp_headers/accel.o 00:02:47.084 CXX test/cpp_headers/assert.o 00:02:47.084 CXX test/cpp_headers/accel_module.o 00:02:47.084 CXX test/cpp_headers/barrier.o 00:02:47.084 CXX test/cpp_headers/base64.o 00:02:47.084 CXX test/cpp_headers/bdev.o 00:02:47.084 CXX test/cpp_headers/bdev_zone.o 00:02:47.084 CXX test/cpp_headers/bdev_module.o 00:02:47.084 CXX test/cpp_headers/bit_array.o 00:02:47.084 CXX test/cpp_headers/bit_pool.o 00:02:47.084 CXX test/cpp_headers/blob_bdev.o 00:02:47.084 CXX test/cpp_headers/blobfs_bdev.o 00:02:47.084 CXX test/cpp_headers/blobfs.o 00:02:47.084 CXX test/cpp_headers/blob.o 00:02:47.084 CC app/fio/nvme/fio_plugin.o 00:02:47.084 CXX test/cpp_headers/conf.o 00:02:47.084 CXX test/cpp_headers/config.o 00:02:47.084 CXX test/cpp_headers/cpuset.o 00:02:47.084 CXX test/cpp_headers/crc16.o 00:02:47.084 CXX test/cpp_headers/crc32.o 00:02:47.084 CXX test/cpp_headers/crc64.o 00:02:47.084 CXX test/cpp_headers/dif.o 00:02:47.084 CXX test/cpp_headers/dma.o 00:02:47.084 CXX test/cpp_headers/endian.o 00:02:47.084 CXX test/cpp_headers/env_dpdk.o 00:02:47.084 CXX test/cpp_headers/env.o 00:02:47.084 CXX test/cpp_headers/event.o 00:02:47.084 CXX test/cpp_headers/fd_group.o 00:02:47.084 CXX test/cpp_headers/fd.o 00:02:47.084 CXX test/cpp_headers/file.o 00:02:47.084 CXX test/cpp_headers/ftl.o 00:02:47.084 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:47.084 CXX test/cpp_headers/gpt_spec.o 00:02:47.084 CXX test/cpp_headers/hexlify.o 00:02:47.084 CXX test/cpp_headers/histogram_data.o 00:02:47.084 CC examples/accel/perf/accel_perf.o 00:02:47.084 CXX test/cpp_headers/idxd.o 00:02:47.084 CXX test/cpp_headers/idxd_spec.o 00:02:47.084 CXX test/cpp_headers/init.o 00:02:47.084 CC examples/util/zipf/zipf.o 00:02:47.084 CC examples/nvme/abort/abort.o 00:02:47.084 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:47.084 CC examples/nvme/reconnect/reconnect.o 00:02:47.084 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:47.084 CC examples/nvme/hello_world/hello_world.o 00:02:47.084 CC examples/vmd/led/led.o 00:02:47.084 CC test/event/reactor/reactor.o 00:02:47.084 CC test/app/histogram_perf/histogram_perf.o 00:02:47.084 CC examples/vmd/lsvmd/lsvmd.o 00:02:47.084 CC test/event/event_perf/event_perf.o 00:02:47.084 CC examples/nvme/hotplug/hotplug.o 00:02:47.084 CC test/app/jsoncat/jsoncat.o 00:02:47.084 CC examples/ioat/perf/perf.o 00:02:47.084 CC examples/nvme/arbitration/arbitration.o 00:02:47.084 CC examples/sock/hello_world/hello_sock.o 00:02:47.084 CC test/app/stub/stub.o 00:02:47.084 LINK spdk_lspci 00:02:47.084 CC examples/ioat/verify/verify.o 00:02:47.084 CC test/event/reactor_perf/reactor_perf.o 00:02:47.084 CC test/thread/lock/spdk_lock.o 00:02:47.084 CC test/thread/poller_perf/poller_perf.o 00:02:47.084 CC examples/blob/cli/blobcli.o 00:02:47.084 CC examples/idxd/perf/perf.o 00:02:47.084 CC test/nvme/reset/reset.o 00:02:47.084 CC app/fio/bdev/fio_plugin.o 00:02:47.084 CC test/nvme/sgl/sgl.o 00:02:47.084 CC test/nvme/overhead/overhead.o 00:02:47.084 CC test/nvme/boot_partition/boot_partition.o 00:02:47.084 CC test/nvme/aer/aer.o 00:02:47.084 CC 
test/nvme/e2edp/nvme_dp.o 00:02:47.084 CC test/nvme/cuse/cuse.o 00:02:47.084 CC test/nvme/err_injection/err_injection.o 00:02:47.084 CC test/nvme/reserve/reserve.o 00:02:47.084 CC test/env/memory/memory_ut.o 00:02:47.084 CC test/nvme/simple_copy/simple_copy.o 00:02:47.084 CC examples/blob/hello_world/hello_blob.o 00:02:47.084 CC test/nvme/compliance/nvme_compliance.o 00:02:47.084 CC test/event/app_repeat/app_repeat.o 00:02:47.084 CC test/env/vtophys/vtophys.o 00:02:47.084 CC test/nvme/startup/startup.o 00:02:47.084 CXX test/cpp_headers/ioat.o 00:02:47.084 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:47.084 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:47.084 CC test/nvme/fdp/fdp.o 00:02:47.084 CC test/nvme/fused_ordering/fused_ordering.o 00:02:47.084 CC test/env/pci/pci_ut.o 00:02:47.084 CC test/nvme/connect_stress/connect_stress.o 00:02:47.084 CC test/event/scheduler/scheduler.o 00:02:47.084 CC examples/nvmf/nvmf/nvmf.o 00:02:47.084 CC examples/bdev/bdevperf/bdevperf.o 00:02:47.084 CC examples/bdev/hello_world/hello_bdev.o 00:02:47.084 CC test/dma/test_dma/test_dma.o 00:02:47.084 CC test/app/bdev_svc/bdev_svc.o 00:02:47.084 CC examples/thread/thread/thread_ex.o 00:02:47.084 CC test/accel/dif/dif.o 00:02:47.084 CC test/blobfs/mkfs/mkfs.o 00:02:47.084 CC test/bdev/bdevio/bdevio.o 00:02:47.084 LINK spdk_nvme_discover 00:02:47.084 LINK spdk_trace_record 00:02:47.084 LINK rpc_client_test 00:02:47.084 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:47.084 LINK nvmf_tgt 00:02:47.084 LINK interrupt_tgt 00:02:47.084 CC test/lvol/esnap/esnap.o 00:02:47.084 CC test/env/mem_callbacks/mem_callbacks.o 00:02:47.084 LINK vhost 00:02:47.351 LINK iscsi_tgt 00:02:47.351 CXX test/cpp_headers/ioat_spec.o 00:02:47.351 LINK jsoncat 00:02:47.351 CXX test/cpp_headers/iscsi_spec.o 00:02:47.351 LINK lsvmd 00:02:47.351 LINK led 00:02:47.351 LINK reactor 00:02:47.351 CXX test/cpp_headers/json.o 00:02:47.351 CXX test/cpp_headers/jsonrpc.o 00:02:47.351 LINK zipf 00:02:47.351 CXX test/cpp_headers/likely.o 00:02:47.351 CXX test/cpp_headers/log.o 00:02:47.351 CXX test/cpp_headers/lvol.o 00:02:47.351 LINK histogram_perf 00:02:47.351 CXX test/cpp_headers/memory.o 00:02:47.351 CXX test/cpp_headers/mmio.o 00:02:47.351 CXX test/cpp_headers/nbd.o 00:02:47.351 LINK spdk_tgt 00:02:47.351 CXX test/cpp_headers/notify.o 00:02:47.351 LINK reactor_perf 00:02:47.351 CXX test/cpp_headers/nvme.o 00:02:47.351 CXX test/cpp_headers/nvme_intel.o 00:02:47.351 CXX test/cpp_headers/nvme_ocssd.o 00:02:47.351 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:47.351 LINK event_perf 00:02:47.351 CXX test/cpp_headers/nvme_spec.o 00:02:47.351 CXX test/cpp_headers/nvme_zns.o 00:02:47.351 CXX test/cpp_headers/nvmf_cmd.o 00:02:47.351 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:47.351 CXX test/cpp_headers/nvmf.o 00:02:47.351 CXX test/cpp_headers/nvmf_spec.o 00:02:47.351 LINK poller_perf 00:02:47.351 CXX test/cpp_headers/nvmf_transport.o 00:02:47.351 CXX test/cpp_headers/opal.o 00:02:47.351 CXX test/cpp_headers/opal_spec.o 00:02:47.351 CXX test/cpp_headers/pci_ids.o 00:02:47.351 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:47.351 struct spdk_nvme_fdp_ruhs ruhs; 00:02:47.351 ^ 00:02:47.351 CXX test/cpp_headers/pipe.o 00:02:47.351 CXX test/cpp_headers/queue.o 00:02:47.351 CXX test/cpp_headers/reduce.o 00:02:47.351 CXX test/cpp_headers/rpc.o 00:02:47.351 LINK vtophys 00:02:47.351 LINK stub 00:02:47.351 
LINK cmb_copy 00:02:47.351 CXX test/cpp_headers/scheduler.o 00:02:47.351 LINK pmr_persistence 00:02:47.351 LINK app_repeat 00:02:47.351 LINK env_dpdk_post_init 00:02:47.351 LINK err_injection 00:02:47.351 CXX test/cpp_headers/scsi.o 00:02:47.351 LINK boot_partition 00:02:47.351 CXX test/cpp_headers/scsi_spec.o 00:02:47.351 LINK doorbell_aers 00:02:47.351 LINK connect_stress 00:02:47.352 LINK hello_world 00:02:47.352 LINK startup 00:02:47.352 LINK ioat_perf 00:02:47.352 CXX test/cpp_headers/sock.o 00:02:47.352 LINK verify 00:02:47.352 LINK hello_sock 00:02:47.352 LINK fused_ordering 00:02:47.352 CXX test/cpp_headers/stdinc.o 00:02:47.352 LINK reserve 00:02:47.352 LINK hotplug 00:02:47.352 LINK bdev_svc 00:02:47.352 LINK simple_copy 00:02:47.352 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:47.352 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:47.352 LINK hello_blob 00:02:47.352 LINK mkfs 00:02:47.352 LINK spdk_trace 00:02:47.352 LINK reset 00:02:47.352 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:47.352 LINK scheduler 00:02:47.352 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:47.352 LINK overhead 00:02:47.352 LINK hello_bdev 00:02:47.352 LINK thread 00:02:47.352 CXX test/cpp_headers/string.o 00:02:47.615 CXX test/cpp_headers/thread.o 00:02:47.615 LINK sgl 00:02:47.615 CXX test/cpp_headers/trace.o 00:02:47.615 LINK aer 00:02:47.615 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:47.615 CXX test/cpp_headers/trace_parser.o 00:02:47.615 CXX test/cpp_headers/tree.o 00:02:47.615 LINK fdp 00:02:47.615 LINK nvmf 00:02:47.615 CXX test/cpp_headers/ublk.o 00:02:47.615 CXX test/cpp_headers/util.o 00:02:47.615 LINK reconnect 00:02:47.615 LINK idxd_perf 00:02:47.615 LINK spdk_dd 00:02:47.615 CXX test/cpp_headers/uuid.o 00:02:47.615 CXX test/cpp_headers/version.o 00:02:47.615 LINK nvme_dp 00:02:47.615 CXX test/cpp_headers/vfio_user_pci.o 00:02:47.615 CXX test/cpp_headers/vfio_user_spec.o 00:02:47.615 CXX test/cpp_headers/vhost.o 00:02:47.615 CXX test/cpp_headers/vmd.o 00:02:47.615 CXX test/cpp_headers/xor.o 00:02:47.615 CXX test/cpp_headers/zipf.o 00:02:47.615 LINK test_dma 00:02:47.615 LINK nvme_manage 00:02:47.615 LINK dif 00:02:47.615 LINK nvme_compliance 00:02:47.615 LINK arbitration 00:02:47.615 LINK abort 00:02:47.615 LINK pci_ut 00:02:47.615 LINK blobcli 00:02:47.615 LINK bdevio 00:02:47.615 LINK spdk_nvme_identify 00:02:47.615 LINK spdk_bdev 00:02:47.875 LINK nvme_fuzz 00:02:47.875 LINK accel_perf 00:02:47.875 LINK llvm_vfio_fuzz 00:02:47.875 1 warning generated. 
00:02:47.875 LINK mem_callbacks 00:02:47.875 LINK spdk_nvme 00:02:48.133 LINK memory_ut 00:02:48.133 LINK bdevperf 00:02:48.133 LINK vhost_fuzz 00:02:48.133 LINK spdk_nvme_perf 00:02:48.133 LINK cuse 00:02:48.133 LINK llvm_nvme_fuzz 00:02:48.133 LINK spdk_top 00:02:48.699 LINK iscsi_fuzz 00:02:48.958 LINK spdk_lock 00:02:50.864 LINK esnap 00:02:50.864 00:02:50.864 real 0m41.632s 00:02:50.864 user 6m3.372s 00:02:50.864 sys 2m42.186s 00:02:50.864 00:10:49 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:50.864 00:10:49 -- common/autotest_common.sh@10 -- $ set +x 00:02:50.864 ************************************ 00:02:50.864 END TEST make 00:02:50.864 ************************************ 00:02:51.123 00:10:50 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:51.123 00:10:50 -- nvmf/common.sh@7 -- # uname -s 00:02:51.123 00:10:50 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:51.123 00:10:50 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:51.123 00:10:50 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:51.123 00:10:50 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:51.123 00:10:50 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:51.123 00:10:50 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:51.123 00:10:50 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:51.123 00:10:50 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:51.123 00:10:50 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:51.123 00:10:50 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:51.123 00:10:50 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:51.123 00:10:50 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:51.123 00:10:50 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:51.123 00:10:50 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:51.123 00:10:50 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:51.123 00:10:50 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:51.123 00:10:50 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:51.123 00:10:50 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:51.123 00:10:50 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:51.123 00:10:50 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:51.123 00:10:50 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:51.123 00:10:50 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:51.123 00:10:50 -- paths/export.sh@5 -- # export PATH 00:02:51.123 00:10:50 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:51.123 00:10:50 -- nvmf/common.sh@46 -- # : 0 00:02:51.123 00:10:50 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:51.123 00:10:50 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:51.123 00:10:50 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:51.123 00:10:50 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:51.123 00:10:50 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:51.123 00:10:50 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:51.123 00:10:50 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:51.123 00:10:50 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:51.123 00:10:50 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:51.123 00:10:50 -- spdk/autotest.sh@32 -- # uname -s 00:02:51.123 00:10:50 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:51.123 00:10:50 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:51.123 00:10:50 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:51.123 00:10:50 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:51.123 00:10:50 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:51.123 00:10:50 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:51.124 00:10:50 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:51.124 00:10:50 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:51.124 00:10:50 -- spdk/autotest.sh@48 -- # udevadm_pid=256054 00:02:51.124 00:10:50 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:51.124 00:10:50 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:51.124 00:10:50 -- spdk/autotest.sh@54 -- # echo 256056 00:02:51.124 00:10:50 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:51.124 00:10:50 -- spdk/autotest.sh@56 -- # echo 256057 00:02:51.124 00:10:50 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:51.124 00:10:50 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:02:51.124 00:10:50 -- spdk/autotest.sh@60 -- # echo 256058 00:02:51.124 00:10:50 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:51.124 00:10:50 -- spdk/autotest.sh@62 -- # echo 256059 00:02:51.124 00:10:50 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:51.124 00:10:50 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:51.124 00:10:50 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:51.124 00:10:50 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:51.124 00:10:50 -- common/autotest_common.sh@10 -- # set +x 00:02:51.124 00:10:50 -- spdk/autotest.sh@70 -- # create_test_list 00:02:51.124 00:10:50 -- common/autotest_common.sh@736 -- # xtrace_disable 00:02:51.124 00:10:50 -- common/autotest_common.sh@10 -- # set +x 00:02:51.124 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:51.124 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:51.124 00:10:50 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:51.124 00:10:50 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:51.124 00:10:50 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:51.124 00:10:50 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:51.124 00:10:50 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:51.124 00:10:50 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:51.124 00:10:50 -- common/autotest_common.sh@1440 -- # uname 00:02:51.124 00:10:50 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:02:51.124 00:10:50 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:51.124 00:10:50 -- common/autotest_common.sh@1460 -- # uname 00:02:51.124 00:10:50 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:02:51.124 00:10:50 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:02:51.124 00:10:50 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:02:51.124 00:10:50 -- spdk/autotest.sh@83 -- # hash lcov 00:02:51.124 00:10:50 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:51.124 00:10:50 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:02:51.124 00:10:50 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:51.124 00:10:50 -- common/autotest_common.sh@10 -- # set +x 00:02:51.124 00:10:50 -- spdk/autotest.sh@102 -- # rm -f 00:02:51.124 00:10:50 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:54.410 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:54.410 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:54.410 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:54.410 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:54.669 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:54.669 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:54.927 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:54.927 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:54.927 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:54.927 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:02:54.927 00:10:53 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:02:54.927 00:10:53 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:02:54.927 00:10:53 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:02:54.927 00:10:53 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:02:54.927 00:10:53 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:02:54.927 00:10:53 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:02:54.927 00:10:53 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:02:54.927 00:10:53 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:54.927 00:10:53 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:02:54.928 00:10:53 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:02:54.928 00:10:53 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:02:54.928 00:10:53 -- spdk/autotest.sh@121 -- # grep -v p 00:02:54.928 00:10:53 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:02:54.928 00:10:53 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:02:54.928 00:10:53 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:02:54.928 00:10:53 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:02:54.928 00:10:53 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:54.928 No valid GPT data, bailing 00:02:54.928 00:10:53 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:54.928 00:10:53 -- scripts/common.sh@393 -- # pt= 00:02:54.928 00:10:53 -- scripts/common.sh@394 -- # return 1 00:02:54.928 00:10:53 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:54.928 1+0 records in 00:02:54.928 1+0 records out 00:02:54.928 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00462748 s, 227 MB/s 00:02:54.928 00:10:53 -- spdk/autotest.sh@129 -- # sync 00:02:54.928 00:10:53 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:54.928 00:10:53 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:54.928 00:10:53 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:01.501 00:10:59 -- spdk/autotest.sh@135 -- # uname -s 00:03:01.501 00:10:59 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:01.501 00:10:59 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:01.501 00:10:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:01.501 00:10:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:01.501 00:10:59 -- common/autotest_common.sh@10 -- # set +x 00:03:01.501 ************************************ 00:03:01.501 START TEST setup.sh 00:03:01.501 ************************************ 00:03:01.501 00:10:59 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:01.501 * Looking for test storage... 00:03:01.501 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:01.501 00:10:59 -- setup/test-setup.sh@10 -- # uname -s 00:03:01.501 00:10:59 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:01.501 00:10:59 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:01.501 00:10:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:01.501 00:10:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:01.501 00:10:59 -- common/autotest_common.sh@10 -- # set +x 00:03:01.501 ************************************ 00:03:01.501 START TEST acl 00:03:01.501 ************************************ 00:03:01.501 00:10:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:01.501 * Looking for test storage... 00:03:01.501 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:01.501 00:11:00 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:01.501 00:11:00 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:01.501 00:11:00 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:01.501 00:11:00 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:01.501 00:11:00 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:01.501 00:11:00 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:01.501 00:11:00 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:01.501 00:11:00 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:01.501 00:11:00 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:01.501 00:11:00 -- setup/acl.sh@12 -- # devs=() 00:03:01.501 00:11:00 -- setup/acl.sh@12 -- # declare -a devs 00:03:01.501 00:11:00 -- setup/acl.sh@13 -- # drivers=() 00:03:01.501 00:11:00 -- setup/acl.sh@13 -- # declare -A drivers 00:03:01.501 00:11:00 -- setup/acl.sh@51 -- # setup reset 00:03:01.501 00:11:00 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:01.501 00:11:00 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:04.795 00:11:03 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:04.795 00:11:03 -- setup/acl.sh@16 -- # local dev driver 00:03:04.795 00:11:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.795 00:11:03 -- setup/acl.sh@15 -- # setup output status 00:03:04.795 00:11:03 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:04.795 00:11:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:08.086 Hugepages 00:03:08.086 node hugesize free / total 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:03:08.086 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:11:06 -- setup/acl.sh@19 -- # continue 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:11:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:11:06 -- setup/acl.sh@20 -- # continue 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
[... identical setup/acl.sh xtrace elided for the remaining ioatdma-bound BDFs 0000:00:04.1 through 0000:80:04.3: each matches *:*:*.*, fails [[ ioatdma == nvme ]], and continues ...]
00:03:08.086 00:11:06 -- 
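The elided iterations consume a status table (Type, BDF, Vendor, Device, NUMA, Driver) and skip every controller still bound to ioatdma. A rough sketch of how such a table can be built from sysfs alone; the columns printed by setup.sh status may differ, this only demonstrates the driver-symlink lookup the loop depends on:

    # list every PCI function as "BDF vendor:device driver"
    for dev in /sys/bus/pci/devices/*; do
        bdf=${dev##*/}
        vendor=$(<"$dev/vendor") device=$(<"$dev/device")
        driver=unbound
        [[ -e $dev/driver ]] && driver=$(basename "$(readlink -f "$dev/driver")")
        printf '%s %s:%s %s\n' "$bdf" "${vendor#0x}" "${device#0x}" "$driver"
    done
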
setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # continue 00:03:08.086 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.086 00:11:06 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:08.086 00:11:06 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:08.086 00:11:06 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:08.087 00:11:06 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:08.087 00:11:06 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:08.087 00:11:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:08.087 00:11:06 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:08.087 00:11:06 -- setup/acl.sh@54 -- # run_test denied denied 00:03:08.087 00:11:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:08.087 00:11:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:08.087 00:11:06 -- common/autotest_common.sh@10 -- # set +x 00:03:08.087 ************************************ 00:03:08.087 START TEST denied 00:03:08.087 ************************************ 00:03:08.087 00:11:06 -- common/autotest_common.sh@1104 -- # denied 00:03:08.087 00:11:06 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:08.087 00:11:06 -- setup/acl.sh@38 -- # setup output config 00:03:08.087 00:11:06 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:08.087 00:11:06 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:08.087 00:11:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:12.284 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:12.284 00:11:10 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:12.284 00:11:10 -- setup/acl.sh@28 -- # local dev driver 00:03:12.284 00:11:10 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:12.284 00:11:10 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:12.284 00:11:10 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:12.284 00:11:10 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:12.284 00:11:10 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:12.284 00:11:10 -- setup/acl.sh@41 -- # setup reset 00:03:12.284 00:11:10 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:12.284 00:11:10 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:16.542 00:03:16.542 real 0m8.256s 00:03:16.542 user 0m2.702s 00:03:16.542 sys 0m4.952s 00:03:16.542 00:11:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:16.542 00:11:15 -- 
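The denied test above sets PCI_BLOCKED to 0000:d8:00.0, expects setup.sh config to print "Skipping denied controller", and then confirms the NVMe controller kept its kernel driver by resolving the driver symlink under /sys/bus/pci/devices. A small sketch of that verification step, assuming bdf and expected are supplied by the caller (both names are illustrative):

    bdf=${1:-0000:d8:00.0} expected=${2:-nvme}
    if [[ -e /sys/bus/pci/devices/$bdf/driver ]]; then
        driver=$(basename "$(readlink -f "/sys/bus/pci/devices/$bdf/driver")")
    else
        driver=unbound
    fi
    # fail loudly when the binding does not match the blocklist policy
    [[ $driver == "$expected" ]] || { echo "$bdf bound to $driver, expected $expected" >&2; exit 1; }
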
common/autotest_common.sh@10 -- # set +x 00:03:16.542 ************************************ 00:03:16.542 END TEST denied 00:03:16.542 ************************************ 00:03:16.542 00:11:15 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:16.542 00:11:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:16.542 00:11:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:16.543 00:11:15 -- common/autotest_common.sh@10 -- # set +x 00:03:16.543 ************************************ 00:03:16.543 START TEST allowed 00:03:16.543 ************************************ 00:03:16.543 00:11:15 -- common/autotest_common.sh@1104 -- # allowed 00:03:16.543 00:11:15 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:16.543 00:11:15 -- setup/acl.sh@45 -- # setup output config 00:03:16.543 00:11:15 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:16.543 00:11:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:16.543 00:11:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:21.817 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:21.817 00:11:20 -- setup/acl.sh@47 -- # verify 00:03:21.817 00:11:20 -- setup/acl.sh@28 -- # local dev driver 00:03:21.817 00:11:20 -- setup/acl.sh@48 -- # setup reset 00:03:21.817 00:11:20 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:21.817 00:11:20 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.106 00:03:25.106 real 0m8.986s 00:03:25.106 user 0m2.532s 00:03:25.106 sys 0m5.028s 00:03:25.106 00:11:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:25.106 00:11:24 -- common/autotest_common.sh@10 -- # set +x 00:03:25.106 ************************************ 00:03:25.106 END TEST allowed 00:03:25.106 ************************************ 00:03:25.106 00:03:25.106 real 0m24.163s 00:03:25.106 user 0m7.615s 00:03:25.106 sys 0m14.745s 00:03:25.106 00:11:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:25.106 00:11:24 -- common/autotest_common.sh@10 -- # set +x 00:03:25.106 ************************************ 00:03:25.106 END TEST acl 00:03:25.106 ************************************ 00:03:25.106 00:11:24 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:25.106 00:11:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:25.106 00:11:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:25.106 00:11:24 -- common/autotest_common.sh@10 -- # set +x 00:03:25.107 ************************************ 00:03:25.107 START TEST hugepages 00:03:25.107 ************************************ 00:03:25.107 00:11:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:25.367 * Looking for test storage... 
00:03:25.367 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:25.367 00:11:24 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:25.367 00:11:24 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:25.367 00:11:24 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:25.367 00:11:24 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:25.367 00:11:24 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:25.367 00:11:24 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:25.367 00:11:24 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:25.367 00:11:24 -- setup/common.sh@18 -- # local node= 00:03:25.367 00:11:24 -- setup/common.sh@19 -- # local var val 00:03:25.367 00:11:24 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.367 00:11:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.367 00:11:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:25.367 00:11:24 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:25.367 00:11:24 -- setup/common.sh@28 -- # mapfile -t mem 00:03:25.367 00:11:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.367 00:11:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41716960 kB' 'MemAvailable: 44097676 kB' 'Buffers: 12536 kB' 'Cached: 9968804 kB' 'SwapCached: 16 kB' 'Active: 8209336 kB' 'Inactive: 2354388 kB' 'Active(anon): 7733984 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 586188 kB' 'Mapped: 162240 kB' 'Shmem: 7208688 kB' 'KReclaimable: 243160 kB' 'Slab: 778952 kB' 'SReclaimable: 243160 kB' 'SUnreclaim: 535792 kB' 'KernelStack: 21952 kB' 'PageTables: 8388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 9159136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213284 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # continue 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # continue 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # continue 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:25.367 00:11:24 -- setup/common.sh@32 -- # continue 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.367 00:11:24 -- setup/common.sh@31 -- # read -r var val _
[... identical setup/common.sh xtrace elided while the loop walks the remaining /proc/meminfo fields (Cached through HugePages_Surp) looking for Hugepagesize; every non-matching field takes the continue branch ...]
00:03:25.369 00:11:24 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:25.369 00:11:24 -- setup/common.sh@33 -- # echo 2048 00:03:25.369 00:11:24 -- setup/common.sh@33 -- # return 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:25.369 00:11:24 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:25.369 00:11:24 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:25.369 00:11:24 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:25.369 00:11:24 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:25.369 00:11:24 -- 
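The field walk elided above is the whole of get_meminfo: split each /proc/meminfo line on ': ' into a key and a value, and return the value once the requested key (here Hugepagesize) comes up. A simplified, node-agnostic sketch of the same pattern (the real setup/common.sh can also read a single NUMA node's meminfo):

    # print the value column for one /proc/meminfo key; most values are
    # in kB, the HugePages_* rows are plain counts
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    get_meminfo Hugepagesize   # prints 2048 on this runner
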
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:25.369 00:11:24 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:25.369 00:11:24 -- setup/hugepages.sh@207 -- # get_nodes 00:03:25.369 00:11:24 -- setup/hugepages.sh@27 -- # local node 00:03:25.369 00:11:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:25.369 00:11:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:25.369 00:11:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:25.369 00:11:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:25.369 00:11:24 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:25.369 00:11:24 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:25.369 00:11:24 -- setup/hugepages.sh@208 -- # clear_hp 00:03:25.369 00:11:24 -- setup/hugepages.sh@37 -- # local node hp 00:03:25.369 00:11:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:25.369 00:11:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:25.369 00:11:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:25.369 00:11:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:25.369 00:11:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:25.369 00:11:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:25.369 00:11:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:25.369 00:11:24 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:25.369 00:11:24 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:25.369 00:11:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:25.369 00:11:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:25.369 00:11:24 -- common/autotest_common.sh@10 -- # set +x 00:03:25.369 ************************************ 00:03:25.369 START TEST default_setup 00:03:25.369 ************************************ 00:03:25.369 00:11:24 -- common/autotest_common.sh@1104 -- # default_setup 00:03:25.369 00:11:24 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:25.369 00:11:24 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:25.369 00:11:24 -- setup/hugepages.sh@51 -- # shift 00:03:25.369 00:11:24 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:25.369 00:11:24 -- setup/hugepages.sh@52 -- # local node_ids 00:03:25.369 00:11:24 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:25.369 00:11:24 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:25.369 00:11:24 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:25.369 00:11:24 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:25.369 00:11:24 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:25.369 00:11:24 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:25.369 00:11:24 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:25.369 00:11:24 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:25.369 00:11:24 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
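clear_hp, traced above, walks both NUMA nodes and zeroes every per-size hugepage pool before default_setup requests its own 1024 pages on node 0. The sysfs writes it performs amount to the loop below (root required; the paths are standard Linux sysfs on NUMA machines):

    # reset every per-node hugepage pool to zero, as clear_hp does
    for hp in /sys/devices/system/node/node*/hugepages/hugepages-*/nr_hugepages; do
        echo 0 > "$hp"
    done
    export CLEAR_HUGE=yes   # exported in the trace above for later setup.sh runs
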
00:03:25.369 00:11:24 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:25.369 00:11:24 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:25.369 00:11:24 -- setup/hugepages.sh@73 -- # return 0 00:03:25.369 00:11:24 -- setup/hugepages.sh@137 -- # setup output 00:03:25.369 00:11:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:25.369 00:11:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:28.662 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:28.662 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:30.043 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:30.043 00:11:28 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:30.043 00:11:28 -- setup/hugepages.sh@89 -- # local node 00:03:30.043 00:11:28 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:30.043 00:11:28 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:30.043 00:11:28 -- setup/hugepages.sh@92 -- # local surp 00:03:30.043 00:11:28 -- setup/hugepages.sh@93 -- # local resv 00:03:30.043 00:11:28 -- setup/hugepages.sh@94 -- # local anon 00:03:30.043 00:11:28 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:30.043 00:11:28 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:30.043 00:11:28 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:30.043 00:11:28 -- setup/common.sh@18 -- # local node= 00:03:30.043 00:11:28 -- setup/common.sh@19 -- # local var val 00:03:30.043 00:11:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.043 00:11:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.043 00:11:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.043 00:11:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.043 00:11:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.043 00:11:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43945848 kB' 'MemAvailable: 46326548 kB' 'Buffers: 12536 kB' 'Cached: 9968924 kB' 'SwapCached: 16 kB' 'Active: 8223108 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747756 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599456 kB' 'Mapped: 162508 kB' 'Shmem: 7208808 kB' 'KReclaimable: 243128 kB' 'Slab: 778184 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 535056 kB' 'KernelStack: 
22128 kB' 'PageTables: 8396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9174908 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213540 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.043 00:11:28 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.043 
00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.043 00:11:28 -- setup/common.sh@31 -- # read -r var val _
[... identical setup/common.sh xtrace elided while the loop walks the remaining /proc/meminfo fields (Inactive(file) through VmallocTotal) looking for AnonHugePages; every non-matching field takes the continue branch ...]
00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.044 00:11:28 -- setup/common.sh@33 -- # echo 0 00:03:30.044 00:11:28 -- setup/common.sh@33 -- # return 0 00:03:30.044 00:11:28 -- setup/hugepages.sh@97 -- # anon=0 00:03:30.044 00:11:28 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:30.044 00:11:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:30.044 00:11:28 -- setup/common.sh@18 -- # local node= 00:03:30.044 00:11:28 -- setup/common.sh@19 -- # local var val 00:03:30.044 00:11:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.044 00:11:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.044 00:11:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.044 00:11:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.044 00:11:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.044 00:11:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.044 00:11:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43950376 kB' 'MemAvailable: 46331076 kB' 'Buffers: 12536 kB' 'Cached: 9968924 kB' 'SwapCached: 16 kB' 'Active: 8223108 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747756 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599472 kB' 'Mapped: 162468 kB' 'Shmem: 7208808 kB' 'KReclaimable: 243128 kB' 'Slab: 778156 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 535028 kB' 'KernelStack: 22032 kB' 'PageTables: 8420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9176152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213556 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 
kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:30.044 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.044 00:11:28 -- setup/common.sh@31 -- # read -r var val _
[... identical setup/common.sh xtrace elided while the loop walks the remaining /proc/meminfo fields (MemFree through HugePages_Free) looking for HugePages_Surp; every non-matching field takes the continue branch; the log chunk ends mid-scan ...]
00:03:30.046 00:11:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:30.046 00:11:28 -- setup/common.sh@33 -- # echo 0 00:03:30.046 00:11:28 -- setup/common.sh@33 -- # return 0 00:03:30.046 00:11:28 -- setup/hugepages.sh@99 -- # surp=0 00:03:30.046 00:11:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:30.046 00:11:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:30.046 00:11:28 -- setup/common.sh@18 -- # local node= 00:03:30.046 00:11:28 -- setup/common.sh@19 -- # local var val 00:03:30.046 00:11:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.046 00:11:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.046 00:11:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.046 00:11:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.046 00:11:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.046 00:11:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.046 00:11:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43948752 kB' 'MemAvailable: 46329452 kB' 'Buffers: 12536 kB' 'Cached: 9968936 kB' 'SwapCached: 16 kB' 'Active: 8222632 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747280 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598956 kB' 'Mapped: 162468 kB' 'Shmem: 7208820 kB' 'KReclaimable: 243128 kB' 'Slab: 778152 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 535024 kB' 'KernelStack: 22144 kB' 'PageTables: 8668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9176328 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213540 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # continue 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.046 00:11:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.046 00:11:28 -- setup/common.sh@32 -- # 
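The xtrace above is SPDK's get_meminfo helper at work: it dumps the relevant meminfo file into an array, then splits each line on ': ' and walks the keys until the requested one matches, which is exactly why one comparison-plus-continue triplet appears per key. A minimal sketch of that parsing idiom, assuming the function name get_meminfo_sketch (this is a simplification for illustration, not a copy of setup/common.sh):

    #!/usr/bin/env bash
    # Sketch: fetch one value from /proc/meminfo the way the traced helper does,
    # splitting "Key:   value kB" on colon/whitespace via IFS.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # Stop at the requested key and print its numeric value.
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1   # key not present in this kernel's meminfo
    }

    get_meminfo_sketch HugePages_Surp   # prints 0 on the run traced above

The real helper reads the file into an array with mapfile and replays it with printf (hence the long dump line in the log), but the per-key IFS=': ' read loop is the same.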
[... xtrace elided: the per-key scan repeats, skipping every key from MemTotal through HugePages_Free until HugePages_Rsvd matches ...]
00:03:30.047 00:11:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:30.047 00:11:28 -- setup/common.sh@33 -- # echo 0
00:03:30.047 00:11:28 -- setup/common.sh@33 -- # return 0
00:03:30.047 00:11:28 -- setup/hugepages.sh@100 -- # resv=0
00:03:30.047 00:11:28 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:30.047 nr_hugepages=1024
00:03:30.047 00:11:28 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:30.047 resv_hugepages=0
00:03:30.047 00:11:28 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:30.047 surplus_hugepages=0
00:03:30.047 00:11:28 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:30.047 anon_hugepages=0
00:03:30.047 00:11:28 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:30.047 00:11:28 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:30.047 00:11:28 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:30.047 00:11:28 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:30.047 00:11:28 -- setup/common.sh@18 -- # local node=
00:03:30.047 00:11:28 -- setup/common.sh@19 -- # local var val
00:03:30.047 00:11:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:30.047 00:11:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:30.047 00:11:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:30.047 00:11:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:30.047 00:11:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:30.047 00:11:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:30.047 00:11:29 -- setup/common.sh@31 -- # IFS=': '
00:03:30.047 00:11:29 -- setup/common.sh@31 -- # read -r var val _
00:03:30.047 00:11:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43947192 kB' 'MemAvailable: 46327892 kB' 'Buffers: 12536 kB' 'Cached: 9968952 kB' 'SwapCached: 16 kB' 'Active: 8222912 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747560 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599356 kB' 'Mapped: 162468 kB' 'Shmem: 7208836 kB' 'KReclaimable: 243128 kB' 'Slab: 778152 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 535024 kB' 'KernelStack: 22160 kB' 'PageTables: 8976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9176340 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213524 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
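Having read HugePages_Surp and HugePages_Rsvd, the script cross-checks the kernel's counters against what the test requested: the configured total must equal requested pages plus surplus plus reserved, which is what the (( 1024 == nr_hugepages + surp + resv )) guard in the trace asserts. A sketch of the same arithmetic with the values observed in this run (variable names mirror the trace; the error message is illustrative):

    # Values reported by the three lookups traced above.
    nr_hugepages=1024   # pages the test asked for
    surp=0              # HugePages_Surp: pages allocated beyond the configured pool
    resv=0              # HugePages_Rsvd: pages promised to mappings but not yet faulted in
    total=1024          # HugePages_Total as read back from the kernel

    # The pool is consistent only if the kernel's total accounts for all three.
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage pool consistent"
    else
        echo "hugepage pool inconsistent: total=$total" >&2
        exit 1
    fi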
[... xtrace elided: the per-key scan repeats against HugePages_Total, skipping every earlier key ...]
00:03:30.048 00:11:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:30.048 00:11:29 -- setup/common.sh@33 -- # echo 1024
00:03:30.048 00:11:29 -- setup/common.sh@33 -- # return 0
00:03:30.049 00:11:29 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:30.049 00:11:29 -- setup/hugepages.sh@112 -- # get_nodes
00:03:30.049 00:11:29 -- setup/hugepages.sh@27 -- # local node
00:03:30.049 00:11:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:30.049 00:11:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:30.049 00:11:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:30.049 00:11:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:30.049 00:11:29 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:30.049 00:11:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:30.049 00:11:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:30.049 00:11:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:30.049 00:11:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:30.049 00:11:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:30.049 00:11:29 -- setup/common.sh@18 -- # local node=0
00:03:30.049 00:11:29 -- setup/common.sh@19 -- # local var val
00:03:30.049 00:11:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:30.049 00:11:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:30.049 00:11:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:30.049 00:11:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:30.049 00:11:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:30.049 00:11:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:30.049 00:11:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25519476 kB' 'MemUsed: 7072608 kB' 'SwapCached: 16 kB' 'Active: 3150912 kB' 'Inactive: 180704 kB' 'Active(anon): 2934292 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065292 kB' 'Mapped: 97008 kB' 'AnonPages: 269532 kB' 'Shmem: 2667968 kB' 'KernelStack: 11976 kB' 'PageTables: 5400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 382168 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 249344 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:30.049 00:11:29 -- setup/common.sh@31 -- # IFS=': '
00:03:30.049 00:11:29 -- setup/common.sh@31 -- # read -r var val _
[... xtrace elided: the per-key scan now repeats against node0's meminfo until HugePages_Surp matches ...]
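Here the same helper is pointed at a single NUMA node: because a node id (0) was passed, mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that the mem=("${mem[@]#Node +([0-9]) }") expansion strips (an extglob pattern, so shopt -s extglob must be in effect). A small standalone sketch of that per-node read, assuming node 0 exists on the host:

    #!/usr/bin/env bash
    shopt -s extglob   # enables the +([0-9]) pattern used in the expansion below

    node=0
    mem_f=/sys/devices/system/node/node${node}/meminfo
    mapfile -t mem < "$mem_f"
    # Each sysfs line looks like "Node 0 MemTotal: 32592084 kB"; drop the prefix
    # so the same "Key: value" parser works for global and per-node files alike.
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}" | grep -E '^HugePages_'   # e.g. HugePages_Total: 1024

Stripping the prefix up front is what lets get_meminfo stay a single code path regardless of whether it is asked about the whole system or one node.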
00:03:30.050 00:11:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:30.050 00:11:29 -- setup/common.sh@33 -- # echo 0
00:03:30.050 00:11:29 -- setup/common.sh@33 -- # return 0
00:03:30.050 00:11:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:30.050 00:11:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:30.050 00:11:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:30.050 00:11:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:30.050 00:11:29 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:30.050 node0=1024 expecting 1024
00:03:30.050 00:11:29 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:30.050
00:03:30.050 real 0m4.760s
00:03:30.050 user 0m1.165s
00:03:30.050 sys 0m2.029s
00:03:30.050 00:11:29 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:30.050 00:11:29 -- common/autotest_common.sh@10 -- # set +x
00:03:30.050 ************************************
00:03:30.050 END TEST default_setup
00:03:30.050 ************************************
00:03:30.308 00:11:29 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:30.308 00:11:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:30.308 00:11:29 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:30.308 00:11:29 -- common/autotest_common.sh@10 -- # set +x
00:03:30.308 ************************************
00:03:30.308 START TEST per_node_1G_alloc
00:03:30.308 ************************************
00:03:30.308 00:11:29 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:03:30.308 00:11:29 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:30.308 00:11:29 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:30.308 00:11:29 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:30.308 00:11:29 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:30.308 00:11:29 -- setup/hugepages.sh@51 -- # shift
00:03:30.308 00:11:29 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:30.308 00:11:29 -- setup/hugepages.sh@52 -- # local node_ids
00:03:30.308 00:11:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:30.308 00:11:29 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:30.308 00:11:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:30.308 00:11:29 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:30.308 00:11:29 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:30.308 00:11:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:30.308 00:11:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:30.308 00:11:29 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:30.308 00:11:29 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:30.308 00:11:29 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:30.308 00:11:29 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:30.308 00:11:29 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:30.308 00:11:29 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:30.308 00:11:29 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:30.308 00:11:29 -- setup/hugepages.sh@73 -- # return 0
00:03:30.308 00:11:29 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:11:29 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:30.309 00:11:29 -- setup/hugepages.sh@146 -- # setup output 00:03:30.309 00:11:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:30.309 00:11:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:33.598 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:33.598 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:33.598 00:11:32 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:33.598 00:11:32 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:33.598 00:11:32 -- setup/hugepages.sh@89 -- # local node 00:03:33.598 00:11:32 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:33.598 00:11:32 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:33.598 00:11:32 -- setup/hugepages.sh@92 -- # local surp 00:03:33.598 00:11:32 -- setup/hugepages.sh@93 -- # local resv 00:03:33.598 00:11:32 -- setup/hugepages.sh@94 -- # local anon 00:03:33.598 00:11:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:33.598 00:11:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:33.598 00:11:32 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:33.598 00:11:32 -- setup/common.sh@18 -- # local node= 00:03:33.598 00:11:32 -- setup/common.sh@19 -- # local var val 00:03:33.598 00:11:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.598 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.598 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.598 00:11:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.598 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.598 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.598 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.598 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.598 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44023488 kB' 'MemAvailable: 46404188 kB' 'Buffers: 12536 kB' 'Cached: 9969056 kB' 'SwapCached: 16 kB' 'Active: 8229172 kB' 'Inactive: 2354388 kB' 'Active(anon): 7753820 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604964 kB' 'Mapped: 
00:03:33.598 00:11:32 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:33.598 00:11:32 -- setup/hugepages.sh@89 -- # local node
00:03:33.598 00:11:32 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:33.598 00:11:32 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:33.598 00:11:32 -- setup/hugepages.sh@92 -- # local surp
00:03:33.598 00:11:32 -- setup/hugepages.sh@93 -- # local resv
00:03:33.598 00:11:32 -- setup/hugepages.sh@94 -- # local anon
00:03:33.598 00:11:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:33.598 00:11:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:33.598 00:11:32 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:33.598 00:11:32 -- setup/common.sh@18 -- # local node=
00:03:33.598 00:11:32 -- setup/common.sh@19 -- # local var val
00:03:33.598 00:11:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.598 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.598 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.598 00:11:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.598 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.598 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.598 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.598 00:11:32 -- setup/common.sh@31 -- # read -r var val _
00:03:33.598 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44023488 kB' 'MemAvailable: 46404188 kB' 'Buffers: 12536 kB' 'Cached: 9969056 kB' 'SwapCached: 16 kB' 'Active: 8229172 kB' 'Inactive: 2354388 kB' 'Active(anon): 7753820 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604964 kB' 'Mapped: 163352 kB' 'Shmem: 7208940 kB' 'KReclaimable: 243128 kB' 'Slab: 777684 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534556 kB' 'KernelStack: 21968 kB' 'PageTables: 8308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9179332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213480 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:33.598 00:11:32 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:33.598 00:11:32 -- setup/common.sh@32 -- # continue
00:03:33.598 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.598 00:11:32 -- setup/common.sh@31 -- # read -r var val _
[... the same compare/continue/read cycle repeats for each remaining /proc/meminfo key until AnonHugePages matches ...]
00:03:33.599 00:11:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:33.599 00:11:32 -- setup/common.sh@33 -- # echo 0
00:03:33.599 00:11:32 -- setup/common.sh@33 -- # return 0
00:03:33.862 00:11:32 -- setup/hugepages.sh@97 -- # anon=0
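That was one full get_meminfo call: mapfile the file, strip any "Node N " prefix, then split each "key: value" line with IFS=': ' until the requested key matches. Reassembled from the trace into one self-contained function; the @-line numbers and the exact control flow around the node argument differ slightly in the real test/setup/common.sh, so treat this as a faithful sketch rather than the file itself:

  #!/usr/bin/env bash
  # Sketch of get_meminfo as reconstructed from the xtrace above.
  shopt -s extglob   # needed for the +([0-9]) pattern below

  get_meminfo() {
      local get=$1 node=$2
      local var val
      local mem_f mem
      mem_f=/proc/meminfo
      # with a node argument, read that node's own meminfo copy instead
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix of per-node files
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && echo "$val" && return 0
      done < <(printf '%s\n' "${mem[@]}")
      return 1
  }

  get_meminfo AnonHugePages      # prints 0 on this machine, per the log
  get_meminfo HugePages_Surp 0   # per-node form, as at the end of this excerpt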
00:03:33.862 00:11:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:33.862 00:11:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.862 00:11:32 -- setup/common.sh@18 -- # local node=
00:03:33.862 00:11:32 -- setup/common.sh@19 -- # local var val
00:03:33.862 00:11:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.862 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.862 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.862 00:11:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.862 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.862 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.862 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.862 00:11:32 -- setup/common.sh@31 -- # read -r var val _
00:03:33.862 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44033356 kB' 'MemAvailable: 46414056 kB' 'Buffers: 12536 kB' 'Cached: 9969060 kB' 'SwapCached: 16 kB' 'Active: 8222656 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747304 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598488 kB' 'Mapped: 161812 kB' 'Shmem: 7208944 kB' 'KReclaimable: 243128 kB' 'Slab: 777684 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534556 kB' 'KernelStack: 21952 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9162208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:33.862 00:11:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.862 00:11:32 -- setup/common.sh@32 -- # continue
00:03:33.862 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.862 00:11:32 -- setup/common.sh@31 -- # read -r var val _
[... the same compare/continue/read cycle repeats for each remaining /proc/meminfo key until HugePages_Surp matches ...]
00:03:33.863 00:11:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.863 00:11:32 -- setup/common.sh@33 -- # echo 0
00:03:33.863 00:11:32 -- setup/common.sh@33 -- # return 0
00:03:33.863 00:11:32 -- setup/hugepages.sh@99 -- # surp=0
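Each counter (anon above, surp here, resv and HugePages_Total below) costs a separate get_meminfo call, so the log repeats the same top-to-bottom scan four times. A hypothetical single-pass variant would collect every counter in one read of /proc/meminfo; the sketch below is illustrative only and is not how the harness is written:

  # Sketch: gather all hugepage counters in one pass instead of one
  # get_meminfo call per key. Names and structure are illustrative.
  declare -A hp
  while IFS=': ' read -r key val _; do
      case $key in
          HugePages_Total|HugePages_Free|HugePages_Rsvd|HugePages_Surp|AnonHugePages)
              hp[$key]=$val ;;
      esac
  done < /proc/meminfo
  echo "total=${hp[HugePages_Total]} rsvd=${hp[HugePages_Rsvd]} surp=${hp[HugePages_Surp]} anon=${hp[AnonHugePages]}"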
00:03:33.863 00:11:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:33.863 00:11:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:33.863 00:11:32 -- setup/common.sh@18 -- # local node=
00:03:33.863 00:11:32 -- setup/common.sh@19 -- # local var val
00:03:33.863 00:11:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.863 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.863 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.863 00:11:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.863 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.863 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.863 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.863 00:11:32 -- setup/common.sh@31 -- # read -r var val _
00:03:33.863 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44034924 kB' 'MemAvailable: 46415624 kB' 'Buffers: 12536 kB' 'Cached: 9969060 kB' 'SwapCached: 16 kB' 'Active: 8220964 kB' 'Inactive: 2354388 kB' 'Active(anon): 7745612 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597236 kB' 'Mapped: 161208 kB' 'Shmem: 7208944 kB' 'KReclaimable: 243128 kB' 'Slab: 777632 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534504 kB' 'KernelStack: 21904 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9162224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:33.863 00:11:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:33.863 00:11:32 -- setup/common.sh@32 -- # continue
00:03:33.863 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.863 00:11:32 -- setup/common.sh@31 -- # read -r var val _
[... the same compare/continue/read cycle repeats for each remaining /proc/meminfo key until HugePages_Rsvd matches ...]
00:03:33.865 00:11:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:33.865 00:11:32 -- setup/common.sh@33 -- # echo 0
00:03:33.865 00:11:32 -- setup/common.sh@33 -- # return 0
00:03:33.865 00:11:32 -- setup/hugepages.sh@100 -- # resv=0
00:03:33.865 00:11:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:33.865 nr_hugepages=1024
00:03:33.865 00:11:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:33.865 resv_hugepages=0
00:03:33.865 00:11:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:33.865 surplus_hugepages=0
00:03:33.865 00:11:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:33.865 anon_hugepages=0
00:03:33.865 00:11:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:33.865 00:11:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
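The arithmetic gates at @107/@109, together with the get_meminfo HugePages_Total call that follows, are the heart of verify_nr_hugepages: the kernel-reported total must equal the requested count plus surplus plus reserved pages, which here is 1024 == 1024 + 0 + 0. A compact sketch of the same assertion, reusing the get_meminfo sketch given earlier in this log (values per this run):

  # Sketch: the accounting check performed above.
  nr_hugepages=1024
  surp=$(get_meminfo HugePages_Surp)
  resv=$(get_meminfo HugePages_Rsvd)
  total=$(get_meminfo HugePages_Total)
  if (( total == nr_hugepages + surp + resv )); then
      echo "hugepage accounting consistent: $total == $nr_hugepages + $surp + $resv"
  else
      echo "mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
      exit 1
  fi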
00:03:33.865 00:11:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:33.865 00:11:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:33.865 00:11:32 -- setup/common.sh@18 -- # local node=
00:03:33.865 00:11:32 -- setup/common.sh@19 -- # local var val
00:03:33.865 00:11:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.865 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.865 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.865 00:11:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.865 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.865 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.865 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.865 00:11:32 -- setup/common.sh@31 -- # read -r var val _
00:03:33.865 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44034832 kB' 'MemAvailable: 46415532 kB' 'Buffers: 12536 kB' 'Cached: 9969088 kB' 'SwapCached: 16 kB' 'Active: 8220996 kB' 'Inactive: 2354388 kB' 'Active(anon): 7745644 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597232 kB' 'Mapped: 161208 kB' 'Shmem: 7208972 kB' 'KReclaimable: 243128 kB' 'Slab: 777632 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534504 kB' 'KernelStack: 21904 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9162240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:33.865 00:11:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:33.865 00:11:32 -- setup/common.sh@32 -- # continue
00:03:33.865 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.865 00:11:32 -- setup/common.sh@31 -- # read -r var val _
[... the same compare/continue/read cycle repeats for each remaining /proc/meminfo key until HugePages_Total matches ...]
00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:33.866 00:11:32 -- setup/common.sh@33 -- # echo 1024
00:03:33.866 00:11:32 -- setup/common.sh@33 -- # return 0
00:03:33.866 00:11:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
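With the global total confirmed, verify_nr_hugepages moves to per-node accounting: get_nodes records a count for every /sys/devices/system/node/nodeN directory (nodes_sys[0]=512 and nodes_sys[1]=512 in this run), and each node is then queried individually, with get_meminfo switching mem_f to the node's own meminfo file. The excerpt does not show which file get_nodes itself reads, so the nr_hugepages knob below is an assumption; the enumeration pattern is the one traced:

  # Sketch: enumerate NUMA nodes and read a per-node 2 MiB hugepage count,
  # following the for-loop traced below (node ID extracted as ${node##*node}).
  declare -A nodes_sys
  for node in /sys/devices/system/node/node[0-9]*; do
      id=${node##*node}
      nodes_sys[$id]=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
      echo "node${id}: ${nodes_sys[$id]} hugepages"
  done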
setup/common.sh@32 -- # continue 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # continue 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # continue 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # continue 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # continue 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # continue 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.866 00:11:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.866 00:11:32 -- setup/common.sh@33 -- # echo 1024 00:03:33.866 00:11:32 -- setup/common.sh@33 -- # return 0 00:03:33.866 00:11:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:33.866 00:11:32 -- setup/hugepages.sh@112 -- # get_nodes 00:03:33.866 00:11:32 -- setup/hugepages.sh@27 -- # local node 00:03:33.866 00:11:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:33.866 00:11:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:33.866 00:11:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:33.866 00:11:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:33.866 00:11:32 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:33.866 00:11:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:33.866 00:11:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:33.866 00:11:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:33.866 00:11:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:33.866 00:11:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.866 00:11:32 -- setup/common.sh@18 -- # local node=0 00:03:33.866 00:11:32 -- setup/common.sh@19 -- # local var val 00:03:33.866 00:11:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.866 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.866 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:33.866 00:11:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:33.866 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.866 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.866 00:11:32 -- setup/common.sh@31 -- # read -r 
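The trace above is setup/common.sh's get_meminfo walking a meminfo file field by field until the requested field matches, then echoing its value. A minimal Bash sketch of that loop, reconstructed from the xtrace alone (the real setup/common.sh may differ in detail):

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below
    # Sketch of get_meminfo as traced: scan /proc/meminfo (or a per-node
    # copy) for field $1 and print its value; $2 is an optional node index.
    get_meminfo() {
        local get=$1 node=${2:-} var val _ line
        local mem_f=/proc/meminfo
        local -a mem
        # use the per-node view when a node index is given and present
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix lines with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }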
00:03:33.866 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26615948 kB' 'MemUsed: 5976136 kB' 'SwapCached: 16 kB' 'Active: 3149492 kB' 'Inactive: 180704 kB' 'Active(anon): 2932872 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065304 kB' 'Mapped: 96720 kB' 'AnonPages: 268204 kB' 'Shmem: 2667980 kB' 'KernelStack: 11768 kB' 'PageTables: 4608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 381628 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 248804 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: setup/common.sh@32 loop compares each node0 meminfo field against HugePages_Surp, hitting continue on each, until it matches]
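Read against the node0 snapshot just printed, the lookups this test performs resolve like so (a hypothetical interactive session using the sketch above, not output from this log):

    $ get_meminfo HugePages_Total 0   # matches 'HugePages_Total: 512'
    512
    $ get_meminfo HugePages_Surp 0    # matches 'HugePages_Surp: 0'
    0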
00:03:33.867 00:11:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.867 00:11:32 -- setup/common.sh@33 -- # echo 0
00:03:33.867 00:11:32 -- setup/common.sh@33 -- # return 0
00:03:33.867 00:11:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:33.867 00:11:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:33.867 00:11:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:33.867 00:11:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:33.867 00:11:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.867 00:11:32 -- setup/common.sh@18 -- # local node=1
00:03:33.867 00:11:32 -- setup/common.sh@19 -- # local var val
00:03:33.867 00:11:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.867 00:11:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.867 00:11:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:33.867 00:11:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:33.867 00:11:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.867 00:11:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.867 00:11:32 -- setup/common.sh@31 -- # IFS=': '
00:03:33.867 00:11:32 -- setup/common.sh@31 -- # read -r var val _
00:03:33.867 00:11:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17419716 kB' 'MemUsed: 10283432 kB' 'SwapCached: 0 kB' 'Active: 5071216 kB' 'Inactive: 2173684 kB' 'Active(anon): 4812484 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6916352 kB' 'Mapped: 64488 kB' 'AnonPages: 328744 kB' 'Shmem: 4541008 kB' 'KernelStack: 10136 kB' 'PageTables: 3372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 110304 kB' 'Slab: 396004 kB' 'SReclaimable: 110304 kB' 'SUnreclaim: 285700 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: setup/common.sh@32 loop compares each node1 meminfo field against HugePages_Surp, hitting continue on each, until it matches]
00:03:33.868 00:11:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.868 00:11:32 -- setup/common.sh@33 -- # echo 0
00:03:33.868 00:11:32 -- setup/common.sh@33 -- # return 0
00:03:33.868 00:11:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:33.868 00:11:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:33.868 00:11:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:33.868 00:11:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:03:33.868 00:11:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:33.868 00:11:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:33.868 00:11:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:33.868 00:11:32 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:03:33.868 00:11:32 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:33.868 real 0m3.666s
00:03:33.868 user 0m1.412s
00:03:33.868 sys 0m2.317s
00:03:33.868 00:11:32 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:33.868 00:11:32 -- common/autotest_common.sh@10 -- # set +x
00:03:33.868 ************************************
00:03:33.868 END TEST per_node_1G_alloc
00:03:33.868 ************************************
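The hugepages.sh@115-@130 trace above is the per-node bookkeeping: each node's expected count is adjusted by reserved and surplus pages, then the resulting key sets are compared against what sysfs reports. A sketch of that accounting, with the arrays seeded from the values in this trace (resv is assumed to be 0 here, computed earlier in the script; the verbatim script logic may differ):

    nodes_test=([0]=512 [1]=512)   # per-node counts requested by the test
    nodes_sys=([0]=512 [1]=512)    # per-node counts get_nodes read from sysfs
    resv=0                         # HugePages_Rsvd, assumed computed earlier
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
    done
    sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1   # sparse arrays double as sorted sets
        sorted_s[nodes_sys[node]]=1
        echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
    done
    [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]]   # both key sets are "512" here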
00:03:33.868 00:11:32 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:33.868 00:11:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:33.868 00:11:32 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:33.868 00:11:32 -- common/autotest_common.sh@10 -- # set +x
00:03:33.868 ************************************
00:03:33.868 START TEST even_2G_alloc
00:03:33.868 ************************************
00:03:33.868 00:11:32 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:03:33.868 00:11:32 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:33.868 00:11:32 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:33.868 00:11:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:33.868 00:11:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:33.868 00:11:32 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:33.868 00:11:32 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:33.868 00:11:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:33.868 00:11:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:33.868 00:11:32 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:33.868 00:11:32 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:33.868 00:11:32 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:33.868 00:11:32 -- setup/hugepages.sh@83 -- # : 512
00:03:33.868 00:11:32 -- setup/hugepages.sh@84 -- # : 1
00:03:33.868 00:11:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:33.868 00:11:32 -- setup/hugepages.sh@83 -- # : 0
00:03:33.868 00:11:32 -- setup/hugepages.sh@84 -- # : 0
00:03:33.868 00:11:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:33.868 00:11:32 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:33.868 00:11:32 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:33.868 00:11:32 -- setup/hugepages.sh@153 -- # setup output
00:03:33.868 00:11:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:33.868 00:11:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:37.158 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:37.158 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:37.421 00:11:36 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:37.421 00:11:36 -- setup/hugepages.sh@89 -- # local node
00:03:37.421 00:11:36 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:37.421 00:11:36 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:37.421 00:11:36 -- setup/hugepages.sh@92 -- # local surp
00:03:37.421 00:11:36 -- setup/hugepages.sh@93 -- # local resv
00:03:37.421 00:11:36 -- setup/hugepages.sh@94 -- # local anon
00:03:37.421 00:11:36 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:37.421 00:11:36 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:37.421 00:11:36 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:37.421 00:11:36 -- setup/common.sh@18 -- # local node=
00:03:37.421 00:11:36 -- setup/common.sh@19 -- # local var val
00:03:37.421 00:11:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.421 00:11:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.421 00:11:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.421 00:11:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.421 00:11:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.421 00:11:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.421 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.421 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.421 00:11:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44081604 kB' 'MemAvailable: 46462304 kB' 'Buffers: 12536 kB' 'Cached: 9969180 kB' 'SwapCached: 16 kB' 'Active: 8222736 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747384 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598224 kB' 'Mapped: 161312 kB' 'Shmem: 7209064 kB' 'KReclaimable: 243128 kB' 'Slab: 777028 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533900 kB' 'KernelStack: 21904 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9162852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213444 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[xtrace elided: setup/common.sh@32 loop compares each /proc/meminfo field against AnonHugePages, hitting continue on each, until it matches]
00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
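The even_2G_alloc setup traced above turns a 2 GiB request into page counts: 2097152 kB at the 2048 kB default hugepage size is 1024 pages, spread evenly over the two NUMA nodes. As worked arithmetic (values taken from the trace and the snapshot above; the exact expressions inside get_test_nr_hugepages are an assumption):

    size_kb=2097152          # argument to get_test_nr_hugepages (2 GiB)
    hugepagesize_kb=2048     # 'Hugepagesize: 2048 kB' in the snapshot
    nr_hugepages=$(( size_kb / hugepagesize_kb ))    # 1024
    _no_nodes=2
    echo "$(( nr_hugepages / _no_nodes )) per node"  # 512, matching nodes_test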
setup/common.sh@33 -- # echo 0 00:03:37.423 00:11:36 -- setup/common.sh@33 -- # return 0 00:03:37.423 00:11:36 -- setup/hugepages.sh@97 -- # anon=0 00:03:37.423 00:11:36 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:37.423 00:11:36 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:37.423 00:11:36 -- setup/common.sh@18 -- # local node= 00:03:37.423 00:11:36 -- setup/common.sh@19 -- # local var val 00:03:37.423 00:11:36 -- setup/common.sh@20 -- # local mem_f mem 00:03:37.423 00:11:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.423 00:11:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:37.423 00:11:36 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:37.423 00:11:36 -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.423 00:11:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44081632 kB' 'MemAvailable: 46462332 kB' 'Buffers: 12536 kB' 'Cached: 9969184 kB' 'SwapCached: 16 kB' 'Active: 8221888 kB' 'Inactive: 2354388 kB' 'Active(anon): 7746536 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597804 kB' 'Mapped: 161220 kB' 'Shmem: 7209068 kB' 'KReclaimable: 243128 kB' 'Slab: 776984 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533856 kB' 'KernelStack: 21888 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9162864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 
-- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.423 00:11:36 -- setup/common.sh@32 -- # continue 00:03:37.423 00:11:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.424 00:11:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.424 00:11:36 -- 
setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.424 00:11:36 -- setup/common.sh@32 -- # continue
00:03:37.424 [xtrace condensed: the @31/@32 read-compare-continue loop repeats identically for each remaining /proc/meminfo key -- Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd -- none matches HugePages_Surp]
00:03:37.425 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.425 00:11:36 -- setup/common.sh@33 -- # echo 0
00:03:37.425 00:11:36 -- setup/common.sh@33 -- # return 0
00:03:37.425 00:11:36 -- setup/hugepages.sh@99 -- # surp=0
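What the trace above is doing: setup/common.sh's get_meminfo walks a snapshot of /proc/meminfo with IFS=': ' and read -r var val _, comparing each key against the requested one (the backslash-escaped \H\u\g\e\P\a\g\e\s\_\S\u\r\p is just how xtrace renders a literal, non-glob match). A minimal standalone sketch of the same scan, assuming direct line-by-line reads from /proc/meminfo rather than the script's mapfile'd snapshot; the function name is illustrative:

# sketch only -- the real helper is setup/common.sh:get_meminfo; this
# simplified version reads /proc/meminfo directly instead of a snapshot
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # quoted RHS forces a literal compare
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1
}

get_meminfo_sketch HugePages_Surp   # prints 0 on this run, matching surp=0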
00:03:37.425 00:11:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:37.425 00:11:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:37.425 00:11:36 -- setup/common.sh@18 -- # local node=
00:03:37.425 00:11:36 -- setup/common.sh@19 -- # local var val
00:03:37.425 00:11:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.425 00:11:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.425 00:11:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.425 00:11:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.425 00:11:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.425 00:11:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.425 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.425 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.425 00:11:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44081724 kB' 'MemAvailable: 46462424 kB' 'Buffers: 12536 kB' 'Cached: 9969196 kB' 'SwapCached: 16 kB' 'Active: 8221888 kB' 'Inactive: 2354388 kB' 'Active(anon): 7746536 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597800 kB' 'Mapped: 161220 kB' 'Shmem: 7209080 kB' 'KReclaimable: 243128 kB' 'Slab: 776984 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533856 kB' 'KernelStack: 21888 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9162876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:37.426 [xtrace condensed: the same per-key scan runs over this snapshot, MemTotal through HugePages_Free, with no match against HugePages_Rsvd]
00:03:37.427 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:37.427 00:11:36 -- setup/common.sh@33 -- # echo 0
00:03:37.427 00:11:36 -- setup/common.sh@33 -- # return 0
00:03:37.427 00:11:36 -- setup/hugepages.sh@100 -- # resv=0
00:03:37.427 00:11:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:37.427 nr_hugepages=1024
00:03:37.427 00:11:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:37.427 resv_hugepages=0
00:03:37.427 00:11:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:37.427 surplus_hugepages=0
00:03:37.427 00:11:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:37.427 anon_hugepages=0
00:03:37.427 00:11:36 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:37.427 00:11:36 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
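The surplus and reserved counts just collected feed the consistency check at setup/hugepages.sh@107: the kernel's HugePages_Total must equal the pages the test requested plus reserved plus surplus (1024 == 1024 + 0 + 0 here). A hedged sketch of the same bookkeeping, using awk in place of the script's own parser; variable names are illustrative:

# sketch of the accounting check seen in the trace above
nr_hugepages=1024                                     # what the test requested
surp=$(awk '$1 == "HugePages_Surp:"  {print $2}' /proc/meminfo)
resv=$(awk '$1 == "HugePages_Rsvd:"  {print $2}' /proc/meminfo)
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: $total == $nr_hugepages + $surp + $resv"
else
    echo "unexpected hugepage accounting" >&2
fi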
00:03:37.427 00:11:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:37.427 00:11:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:37.427 00:11:36 -- setup/common.sh@18 -- # local node=
00:03:37.427 00:11:36 -- setup/common.sh@19 -- # local var val
00:03:37.427 00:11:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.427 00:11:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.427 00:11:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.427 00:11:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.427 00:11:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.427 00:11:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.427 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.427 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.427 00:11:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44080748 kB' 'MemAvailable: 46461448 kB' 'Buffers: 12536 kB' 'Cached: 9969212 kB' 'SwapCached: 16 kB' 'Active: 8222348 kB' 'Inactive: 2354388 kB' 'Active(anon): 7746996 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598368 kB' 'Mapped: 161220 kB' 'Shmem: 7209096 kB' 'KReclaimable: 243128 kB' 'Slab: 776976 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533848 kB' 'KernelStack: 21904 kB' 'PageTables: 8040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9165676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:37.428 [xtrace condensed: per-key scan over this snapshot, MemTotal through Unaccepted, with no match against HugePages_Total]
00:03:37.429 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:37.429 00:11:36 -- setup/common.sh@33 -- # echo 1024
00:03:37.429 00:11:36 -- setup/common.sh@33 -- # return 0
00:03:37.429 00:11:36 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:37.429 00:11:36 -- setup/hugepages.sh@112 -- # get_nodes
00:03:37.429 00:11:36 -- setup/hugepages.sh@27 -- # local node
00:03:37.429 00:11:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:37.429 00:11:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:37.429 00:11:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:37.429 00:11:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:37.429 00:11:36 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:37.429 00:11:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
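get_nodes, traced just above, enumerates the NUMA nodes under /sys/devices/system/node with the extglob pattern node+([0-9]) and records a per-node target of 512 pages, i.e. the 1024 requested pages split evenly across this rig's two nodes. A sketch of that enumeration, assuming a plain even split (the real script may distribute remainders differently):

# sketch: enumerate NUMA nodes and split nr_hugepages evenly across them
shopt -s extglob nullglob
declare -A nodes_sys
nr_hugepages=1024
nodes=(/sys/devices/system/node/node+([0-9]))
(( ${#nodes[@]} > 0 )) || { echo "no NUMA nodes found" >&2; exit 1; }
for node in "${nodes[@]}"; do
    nodes_sys[${node##*node}]=$(( nr_hugepages / ${#nodes[@]} ))   # key "0", "1", ...
done
echo "no_nodes=${#nodes[@]}"   # 2 on this rig
declare -p nodes_sys            # -> nodes_sys=([0]="512" [1]="512" )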
00:03:37.429 00:11:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:37.429 00:11:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:37.429 00:11:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:37.429 00:11:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:37.429 00:11:36 -- setup/common.sh@18 -- # local node=0
00:03:37.429 00:11:36 -- setup/common.sh@19 -- # local var val
00:03:37.429 00:11:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.429 00:11:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.429 00:11:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:37.429 00:11:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:37.429 00:11:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.429 00:11:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.429 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.429 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.429 00:11:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26636216 kB' 'MemUsed: 5955868 kB' 'SwapCached: 16 kB' 'Active: 3150400 kB' 'Inactive: 180704 kB' 'Active(anon): 2933780 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065308 kB' 'Mapped: 96728 kB' 'AnonPages: 269032 kB' 'Shmem: 2667984 kB' 'KernelStack: 11880 kB' 'PageTables: 4572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 381132 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 248308 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:37.429 [xtrace condensed: per-key scan over the node0 snapshot, MemTotal through HugePages_Free, with no match against HugePages_Surp]
00:03:37.430 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.430 00:11:36 -- setup/common.sh@33 -- # echo 0
00:03:37.430 00:11:36 -- setup/common.sh@33 -- # return 0
00:03:37.430 00:11:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
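Note the mem_f switch in the node-0 round above: when get_meminfo is given a node argument, it reads /sys/devices/system/node/node0/meminfo instead of /proc/meminfo, and every line there carries a "Node 0 " prefix, which common.sh@29 strips with mem=("${mem[@]#Node +([0-9]) }") before reusing the same key scan. A standalone sketch of that per-node read (function name is illustrative):

# sketch: per-node meminfo lines look like "Node 0 HugePages_Surp: 0";
# drop the "Node N " prefix, then scan keys exactly as for /proc/meminfo
shopt -s extglob
node_meminfo_sketch() {
    local node=$1 get=$2 line var val _
    while read -r line; do
        line=${line#Node +([0-9]) }              # strip the "Node 0 " prefix
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "/sys/devices/system/node/node${node}/meminfo"
    return 1
}

node_meminfo_sketch 0 HugePages_Surp   # prints 0: node 0 has no surplus pages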
00:03:37.430 00:11:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:37.430 00:11:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:37.430 00:11:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:37.430 00:11:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:37.430 00:11:36 -- setup/common.sh@18 -- # local node=1
00:03:37.430 00:11:36 -- setup/common.sh@19 -- # local var val
00:03:37.430 00:11:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.430 00:11:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.430 00:11:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:37.430 00:11:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:37.430 00:11:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.430 00:11:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.430 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.430 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.430 00:11:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17444032 kB' 'MemUsed: 10259116 kB' 'SwapCached: 0 kB' 'Active: 5072688 kB' 'Inactive: 2173684 kB' 'Active(anon): 4813956 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6916468 kB' 'Mapped: 64492 kB' 'AnonPages: 330104 kB' 'Shmem: 4541124 kB' 'KernelStack: 10152 kB' 'PageTables: 3472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 110304 kB' 'Slab: 395812 kB' 'SReclaimable: 110304 kB' 'SUnreclaim: 285508 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:37.430 [xtrace condensed: per-key scan over the node1 snapshot, MemTotal through HugePages_Free, with no match against HugePages_Surp]
00:03:37.431 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.431 00:11:36 -- setup/common.sh@33 -- # echo 0
00:03:37.431 00:11:36 -- setup/common.sh@33 -- # return 0
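With both nodes reporting zero surplus, the test's last step (hugepages.sh@126-130, just below) compares each node's page count against the expected even split; the [[ 512 == \5\1\2 ]] form in the trace is only xtrace escaping every character so the == comparison is literal rather than a glob match. A hedged sketch of that final verification, assuming the even 512-per-node split:

# sketch of the expected-vs-actual check; quoting the RHS is equivalent to
# the backslash escaping xtrace shows, keeping [[ == ]] a literal comparison
expected=512
for node in 0 1; do
    actual=$(awk '$3 == "HugePages_Total:" {print $4}' \
        "/sys/devices/system/node/node${node}/meminfo")
    echo "node${node}=${actual} expecting ${expected}"
    [[ $actual == "$expected" ]] || echo "node${node} mismatch" >&2
done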
00:03:37.431 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.431 00:11:36 -- setup/common.sh@32 -- # continue
00:03:37.431 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.431 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.431 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.431 00:11:36 -- setup/common.sh@32 -- # continue
00:03:37.431 00:11:36 -- setup/common.sh@31 -- # IFS=': '
00:03:37.431 00:11:36 -- setup/common.sh@31 -- # read -r var val _
00:03:37.431 00:11:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.431 00:11:36 -- setup/common.sh@33 -- # echo 0
00:03:37.431 00:11:36 -- setup/common.sh@33 -- # return 0
00:03:37.431 00:11:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:37.431 00:11:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:37.431 00:11:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:37.431 00:11:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:37.431 00:11:36 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:03:37.431 00:11:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:37.431 00:11:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:37.431 00:11:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:37.431 00:11:36 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:03:37.431 00:11:36 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:37.431
00:03:37.431 real 0m3.636s
00:03:37.431 user 0m1.401s
00:03:37.431 sys 0m2.287s
00:03:37.431 00:11:36 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:37.431 00:11:36 -- common/autotest_common.sh@10 -- # set +x
00:03:37.431 ************************************
00:03:37.431 END TEST even_2G_alloc
00:03:37.431 ************************************
00:03:37.691 00:11:36 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:37.691 00:11:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:37.691 00:11:36 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:37.691 00:11:36 -- common/autotest_common.sh@10 -- # set +x
00:03:37.691 ************************************
00:03:37.691 START TEST odd_alloc
00:03:37.691 ************************************
00:03:37.691 00:11:36 -- common/autotest_common.sh@1104 -- # odd_alloc
00:03:37.691 00:11:36 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:37.691 00:11:36 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:37.691 00:11:36 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:37.691 00:11:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:37.691 00:11:36 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:37.691 00:11:36 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:37.691 00:11:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:37.691 00:11:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:37.691 00:11:36 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:37.691 00:11:36 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:37.691 00:11:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:37.691 00:11:36 -- setup/hugepages.sh@83 -- # : 513
00:03:37.691 00:11:36 -- setup/hugepages.sh@84 -- # : 1
00:03:37.691 00:11:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:37.691 00:11:36 -- setup/hugepages.sh@83 -- # : 0
00:03:37.691 00:11:36 -- setup/hugepages.sh@84 -- # : 0
00:03:37.691 00:11:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.691 00:11:36 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:37.691 00:11:36 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:37.691 00:11:36 -- setup/hugepages.sh@160 -- # setup output
00:03:37.691 00:11:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:37.691 00:11:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:40.979 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:40.979 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:40.979 00:11:39 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:40.979 00:11:39 -- setup/hugepages.sh@89 -- # local node
00:03:40.979 00:11:39 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:40.979 00:11:39 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:40.979 00:11:39 -- setup/hugepages.sh@92 -- # local surp
00:03:40.979 00:11:39 -- setup/hugepages.sh@93 -- # local resv
00:03:40.979 00:11:39 -- setup/hugepages.sh@94 -- # local anon
00:03:40.979 00:11:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:40.979 00:11:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:40.979 00:11:39 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:40.979 00:11:39 -- setup/common.sh@18 -- # local node=
00:03:40.979 00:11:39 -- setup/common.sh@19 -- # local var val
00:03:40.979 00:11:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:40.979 00:11:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:40.979 00:11:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:40.979 00:11:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:40.979 00:11:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:40.979 00:11:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
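[Note on the trace above: odd_alloc requests 1025 hugepages (size=2098176 kB, i.e. HUGEMEM=2049) and spreads them over both NUMA nodes as nodes_test[1]=512, then nodes_test[0]=513, so the odd page lands on node 0. A minimal bash sketch of that split, reconstructed from the traced values alone -- the variable names follow the xtrace, and the divide-by-remaining-nodes rule is inferred from the numbers, not quoted from hugepages.sh:]

    #!/usr/bin/env bash
    # Spread _nr_hugepages across _no_nodes so the per-node counts differ
    # by at most one; reproduces the traced result 512/513 for 1025 pages.
    _nr_hugepages=1025
    _no_nodes=2
    declare -a nodes_test
    left=$_nr_hugepages
    n=$_no_nodes
    while ((n > 0)); do
        nodes_test[n - 1]=$((left / n))   # n=2 -> 512, n=1 -> 513
        left=$((left - left / n))
        ((n--))
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=513 node1=512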
00:03:40.979 00:11:39 -- setup/common.sh@31 -- # IFS=': '
00:03:40.979 00:11:39 -- setup/common.sh@31 -- # read -r var val _
00:03:40.979 00:11:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44102332 kB' 'MemAvailable: 46483032 kB' 'Buffers: 12536 kB' 'Cached: 9969316 kB' 'SwapCached: 16 kB' 'Active: 8223480 kB' 'Inactive: 2354388 kB' 'Active(anon): 7748128 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598772 kB' 'Mapped: 161340 kB' 'Shmem: 7209200 kB' 'KReclaimable: 243128 kB' 'Slab: 777072 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533944 kB' 'KernelStack: 21920 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 9163508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213508 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace condensed: get_meminfo checks every snapshot key from MemTotal through HardwareCorrupted against AnonHugePages; each one misses and hits setup/common.sh@32 continue ...]
00:03:40.980 00:11:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:40.980 00:11:39 -- setup/common.sh@33 -- # echo 0
00:03:40.980 00:11:39 -- setup/common.sh@33 -- # return 0
00:03:40.980 00:11:39 -- setup/hugepages.sh@97 -- # anon=0
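[Note: the AnonHugePages lookup that just returned anon=0 is one pass of the get_meminfo helper whose expanded loop dominates this trace: snapshot the meminfo file, strip any per-node "Node N" prefixes, then read key/value pairs with IFS=': ' until the requested key matches. A simplified, runnable sketch of the same pattern, assembled from the traced statements -- names follow the xtrace, not the verbatim setup/common.sh source:]

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern seen in the trace
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _ mem
        local mem_f=/proc/meminfo
        # Per-node queries read that node's own meminfo file when it exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix
        while IFS=': ' read -r var val _; do
            # Non-matching keys fall through ("continue" in the trace above).
            [[ $var == "$get" ]] && echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
    get_meminfo HugePages_Total   # prints 1025 on the box traced above

[Each call re-reads and re-scans the whole file, which is why a single verify pass produces four full key-by-key sweeps in this log.]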
00:03:40.980 00:11:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:40.980 00:11:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:40.980 00:11:39 -- setup/common.sh@18 -- # local node=
00:03:40.980 00:11:39 -- setup/common.sh@19 -- # local var val
00:03:40.980 00:11:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:40.980 00:11:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:40.980 00:11:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:40.980 00:11:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:40.980 00:11:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:40.980 00:11:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:40.980 00:11:39 -- setup/common.sh@31 -- # IFS=': '
00:03:40.980 00:11:39 -- setup/common.sh@31 -- # read -r var val _
00:03:40.980 00:11:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44107800 kB' 'MemAvailable: 46488500 kB' 'Buffers: 12536 kB' 'Cached: 9969316 kB' 'SwapCached: 16 kB' 'Active: 8222564 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747212 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598332 kB' 'Mapped: 161220 kB' 'Shmem: 7209200 kB' 'KReclaimable: 243128 kB' 'Slab: 777016 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533888 kB' 'KernelStack: 21888 kB' 'PageTables: 8004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 9163520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace condensed: every snapshot key from MemTotal through HugePages_Rsvd misses the HugePages_Surp match and hits setup/common.sh@32 continue ...]
00:03:40.981 00:11:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:40.981 00:11:39 -- setup/common.sh@33 -- # echo 0
00:03:40.981 00:11:39 -- setup/common.sh@33 -- # return 0
00:03:40.981 00:11:39 -- setup/hugepages.sh@99 -- # surp=0
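[A quick sanity check on the snapshots themselves: the hugepage fields are internally consistent. 1025 pages at Hugepagesize 2048 kB is exactly the Hugetlb figure, and the size requested in the odd_alloc prologue (2098176 kB) only fits in 1025 whole pages. Two illustrative one-liners -- the round-up rule is inferred from the two traced numbers, not quoted from hugepages.sh:]

    echo $((1025 * 2048))               # 2099200 -> matches 'Hugetlb: 2099200 kB'
    echo $(((2098176 + 2047) / 2048))   # 1025 -> 2098176 kB rounded up to whole 2048 kB pages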
00:03:40.981 00:11:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:40.981 00:11:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:40.981 00:11:39 -- setup/common.sh@18 -- # local node=
00:03:40.981 00:11:39 -- setup/common.sh@19 -- # local var val
00:03:40.981 00:11:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:40.981 00:11:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:40.981 00:11:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:40.981 00:11:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:40.981 00:11:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:40.981 00:11:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:40.981 00:11:39 -- setup/common.sh@31 -- # IFS=': '
00:03:40.981 00:11:39 -- setup/common.sh@31 -- # read -r var val _
00:03:40.981 00:11:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44109564 kB' 'MemAvailable: 46490264 kB' 'Buffers: 12536 kB' 'Cached: 9969316 kB' 'SwapCached: 16 kB' 'Active: 8222564 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747212 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598332 kB' 'Mapped: 161220 kB' 'Shmem: 7209200 kB' 'KReclaimable: 243128 kB' 'Slab: 777016 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533888 kB' 'KernelStack: 21888 kB' 'PageTables: 8004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 9163532 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace condensed: every snapshot key from MemTotal through HugePages_Free misses the HugePages_Rsvd match and hits setup/common.sh@32 continue ...]
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:40.982 00:11:39 -- setup/common.sh@33 -- # echo 0
00:03:40.982 00:11:39 -- setup/common.sh@33 -- # return 0
00:03:40.982 00:11:39 -- setup/hugepages.sh@100 -- # resv=0
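[With anon=0, surp=0 and resv=0 collected, the lines that follow echo the counters and assert that the kernel's global count matches what the test requested (1025 == nr_hugepages + surp + resv, then 1025 == nr_hugepages) before re-reading HugePages_Total. A condensed sketch of that verification flow, reusing the get_meminfo sketch above -- a hedged reconstruction from the trace, not the verbatim verify_nr_hugepages:]

    verify_nr_hugepages() {
        local nr_hugepages=1025   # what odd_alloc configured above
        local anon surp resv total
        anon=$(get_meminfo AnonHugePages)    # 0 in this run
        surp=$(get_meminfo HugePages_Surp)   # 0
        resv=$(get_meminfo HugePages_Rsvd)   # 0
        total=$(get_meminfo HugePages_Total)
        echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"
        # The pool is healthy only if every requested page is really there
        # and none of them are surplus or reserved pages in disguise.
        ((total == nr_hugepages + surp + resv)) && ((total == nr_hugepages))
    }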
00:03:40.982 00:11:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
nr_hugepages=1025
00:03:40.982 00:11:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:40.982 00:11:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:40.982 00:11:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:40.982 00:11:39 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:40.982 00:11:39 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:03:40.982 00:11:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:40.982 00:11:39 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:40.982 00:11:39 -- setup/common.sh@18 -- # local node=
00:03:40.982 00:11:39 -- setup/common.sh@19 -- # local var val
00:03:40.982 00:11:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:40.982 00:11:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:40.982 00:11:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:40.982 00:11:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:40.982 00:11:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:40.982 00:11:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': '
00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _
00:03:40.982 00:11:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44109964 kB' 'MemAvailable: 46490664 kB' 'Buffers: 12536 kB' 'Cached: 9969348 kB' 'SwapCached: 16 kB' 'Active: 8222908 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747556 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598672 kB' 'Mapped: 161220 kB' 'Shmem: 7209232 kB' 'KReclaimable: 243128 kB' 'Slab: 777000 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533872 kB' 'KernelStack: 21888 kB' 'PageTables: 8008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 9163548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue
00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': '
00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue
00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': '
00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue
' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.982 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.982 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
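The trace above is setup/common.sh's get_meminfo scanning every meminfo key until it reaches the requested one (HugePages_Total here), then echoing its value column. A minimal sketch of that lookup, reconstructed from the xtrace and simplified (it streams the file through sed instead of mapfile, so treat it as an approximation of the helper rather than a copy):

    get_meminfo() {                      # e.g. get_meminfo HugePages_Total 0
        local get=$1 node=$2 mem_f=/proc/meminfo var val _
        # With a node index, read the per-node meminfo under sysfs instead
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        # Per-node lines carry a "Node <n>" prefix; strip it so both file
        # formats split identically on IFS=': '
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"              # value only; the "kB" unit falls into $_
                return 0
            fi
        done < <(sed 's/^Node [0-9]* *//' "$mem_f")
        return 1
    }

In this run get_meminfo HugePages_Total yields 1025, the figure hugepages.sh@110 checks against nr_hugepages + surp + resv.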
00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:40.983 00:11:39 -- setup/common.sh@33 -- # echo 1025 00:03:40.983 00:11:39 -- setup/common.sh@33 -- # return 0 00:03:40.983 00:11:39 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:40.983 00:11:39 -- setup/hugepages.sh@112 -- # get_nodes 00:03:40.983 00:11:39 -- setup/hugepages.sh@27 -- # local node 00:03:40.983 00:11:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:40.983 00:11:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:40.983 00:11:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:40.983 00:11:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:40.983 00:11:39 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:40.983 00:11:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:40.983 00:11:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:40.983 00:11:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:40.983 00:11:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:40.983 00:11:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:40.983 00:11:39 -- setup/common.sh@18 -- # local node=0 00:03:40.983 00:11:39 -- setup/common.sh@19 -- # 
local var val 00:03:40.983 00:11:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:40.983 00:11:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:40.983 00:11:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:40.983 00:11:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:40.983 00:11:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:40.983 00:11:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26654376 kB' 'MemUsed: 5937708 kB' 'SwapCached: 16 kB' 'Active: 3149392 kB' 'Inactive: 180704 kB' 'Active(anon): 2932772 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065320 kB' 'Mapped: 96732 kB' 'AnonPages: 267916 kB' 'Shmem: 2667996 kB' 'KernelStack: 11752 kB' 'PageTables: 4572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 381292 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 248468 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 
00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.983 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.983 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:39 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@33 -- # echo 0 00:03:40.984 00:11:40 -- setup/common.sh@33 -- # return 0 00:03:40.984 00:11:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:40.984 00:11:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:40.984 00:11:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:40.984 00:11:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:40.984 00:11:40 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:40.984 00:11:40 -- setup/common.sh@18 -- # local node=1 00:03:40.984 00:11:40 -- setup/common.sh@19 -- # local var val 00:03:40.984 00:11:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:40.984 00:11:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:40.984 00:11:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:40.984 00:11:40 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:40.984 00:11:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:40.984 00:11:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17455316 kB' 'MemUsed: 10247832 kB' 'SwapCached: 0 kB' 'Active: 5073628 kB' 'Inactive: 2173684 kB' 'Active(anon): 4814896 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6916604 kB' 'Mapped: 64488 kB' 'AnonPages: 330860 kB' 'Shmem: 4541260 kB' 'KernelStack: 10152 kB' 'PageTables: 3480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 110304 kB' 'Slab: 395708 kB' 'SReclaimable: 110304 kB' 'SUnreclaim: 285404 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 
00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.984 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.984 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # continue 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.985 00:11:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.985 00:11:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.985 00:11:40 -- setup/common.sh@33 -- # echo 0 00:03:40.985 00:11:40 -- setup/common.sh@33 -- # return 0 00:03:40.985 00:11:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:40.985 00:11:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:40.985 00:11:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:40.985 00:11:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:40.985 00:11:40 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:40.985 node0=512 expecting 513 00:03:41.244 00:11:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.244 00:11:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.244 00:11:40 -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:03:41.244 00:11:40 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:41.244 node1=513 expecting 512 00:03:41.244 00:11:40 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:41.244 00:03:41.244 real 0m3.519s 00:03:41.244 user 0m1.322s 00:03:41.244 sys 0m2.239s 00:03:41.244 00:11:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:41.244 00:11:40 -- common/autotest_common.sh@10 -- # set +x 00:03:41.244 ************************************ 00:03:41.244 END TEST odd_alloc 00:03:41.244 ************************************ 00:03:41.244 00:11:40 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:41.244 00:11:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:41.244 00:11:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:41.244 00:11:40 -- common/autotest_common.sh@10 -- # set +x 00:03:41.244 ************************************ 00:03:41.244 START TEST custom_alloc 00:03:41.244 ************************************ 00:03:41.244 00:11:40 -- common/autotest_common.sh@1104 -- # custom_alloc 00:03:41.244 00:11:40 -- setup/hugepages.sh@167 -- # local IFS=, 00:03:41.244 00:11:40 -- setup/hugepages.sh@169 -- # local node 00:03:41.244 00:11:40 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:41.244 00:11:40 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:41.244 00:11:40 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:41.244 00:11:40 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:41.244 00:11:40 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:41.244 00:11:40 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:41.244 00:11:40 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:41.244 00:11:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:41.244 00:11:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:41.244 00:11:40 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:41.244 00:11:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:41.244 00:11:40 -- setup/hugepages.sh@83 -- # : 256 00:03:41.244 00:11:40 -- setup/hugepages.sh@84 -- # : 1 00:03:41.244 00:11:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:41.244 00:11:40 -- setup/hugepages.sh@83 -- # : 0 00:03:41.244 00:11:40 -- setup/hugepages.sh@84 -- # : 0 00:03:41.244 00:11:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:41.244 00:11:40 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:41.244 00:11:40 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:41.244 00:11:40 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:41.244 00:11:40 -- 
setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:41.244 00:11:40 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:41.244 00:11:40 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:41.244 00:11:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:41.244 00:11:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:41.244 00:11:40 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:41.244 00:11:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:41.244 00:11:40 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:41.244 00:11:40 -- setup/hugepages.sh@78 -- # return 0 00:03:41.244 00:11:40 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:41.244 00:11:40 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:41.244 00:11:40 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:41.244 00:11:40 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:41.244 00:11:40 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:41.244 00:11:40 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:41.244 00:11:40 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:41.244 00:11:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:41.244 00:11:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:41.244 00:11:40 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:41.244 00:11:40 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:41.244 00:11:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:41.244 00:11:40 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:41.244 00:11:40 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:41.244 00:11:40 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:41.244 00:11:40 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:41.244 00:11:40 -- setup/hugepages.sh@78 -- # return 0 00:03:41.244 00:11:40 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:41.244 00:11:40 -- setup/hugepages.sh@187 -- # setup output 00:03:41.244 00:11:40 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.244 00:11:40 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:44.531 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 
00:03:44.531 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:44.531 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:44.792 00:11:43 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:44.792 00:11:43 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:44.792 00:11:43 -- setup/hugepages.sh@89 -- # local node 00:03:44.792 00:11:43 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:44.792 00:11:43 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:44.792 00:11:43 -- setup/hugepages.sh@92 -- # local surp 00:03:44.792 00:11:43 -- setup/hugepages.sh@93 -- # local resv 00:03:44.792 00:11:43 -- setup/hugepages.sh@94 -- # local anon 00:03:44.792 00:11:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:44.792 00:11:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:44.792 00:11:43 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:44.792 00:11:43 -- setup/common.sh@18 -- # local node= 00:03:44.792 00:11:43 -- setup/common.sh@19 -- # local var val 00:03:44.792 00:11:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.792 00:11:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.792 00:11:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.792 00:11:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.792 00:11:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.792 00:11:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.792 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.792 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43070304 kB' 'MemAvailable: 45451004 kB' 'Buffers: 12536 kB' 'Cached: 9969456 kB' 'SwapCached: 16 kB' 'Active: 8222668 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747316 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597904 kB' 'Mapped: 161340 kB' 'Shmem: 7209340 kB' 'KReclaimable: 243128 kB' 'Slab: 777320 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534192 kB' 'KernelStack: 21952 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 9164172 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213396 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 
00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
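verify_nr_hugepages only records anonymous hugepage usage when transparent hugepages are enabled: the hugepages.sh@96 test earlier in this block matches the kernel knob's bracketed selection ("always [madvise] never" on this host) against *[never]*. With the standard sysfs knob and the get_meminfo sketch from above, the gate amounts to:

    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)  # "always [madvise] never" here
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # in kB; reads 0 in this run
    else
        anon=0                              # THP disabled, nothing to account for
    fi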
00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ 
SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.793 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.793 00:11:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.793 00:11:43 -- setup/common.sh@33 -- # echo 0 00:03:44.793 00:11:43 -- setup/common.sh@33 -- # return 0 00:03:44.793 00:11:43 -- setup/hugepages.sh@97 -- # anon=0 00:03:44.793 00:11:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:44.794 00:11:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:44.794 00:11:43 -- setup/common.sh@18 -- # local node= 00:03:44.794 00:11:43 -- setup/common.sh@19 -- # local var val 00:03:44.794 00:11:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.794 00:11:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.794 00:11:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.794 00:11:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.794 00:11:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.794 00:11:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43071804 kB' 'MemAvailable: 45452504 kB' 'Buffers: 12536 kB' 'Cached: 9969456 kB' 'SwapCached: 16 kB' 'Active: 8222368 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747016 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598064 kB' 'Mapped: 161224 kB' 'Shmem: 7209340 kB' 'KReclaimable: 243128 kB' 'Slab: 777316 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534188 kB' 'KernelStack: 21936 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 9164184 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 
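The HUGENODE string assembled at hugepages.sh@187, 'nodes_hp[0]=512,nodes_hp[1]=1024', encodes the per-node placement for the 1536 pages this custom_alloc pass requests; the counts are just the requested sizes divided by the default hugepage size (1048576 kB / 2048 kB = 512 and 2097152 kB / 2048 kB = 1024). scripts/setup.sh consumes that string, and honoring it boils down to per-node sysfs writes. A sketch with a hypothetical helper (the name and parsing are illustrative, not setup.sh's actual code; 2048 kB pages assumed):

    apply_hugenode() {   # hypothetical; scripts/setup.sh does the real work
        local spec=$1 entry node count
        local -a entries
        IFS=, read -ra entries <<< "$spec"
        for entry in "${entries[@]}"; do   # e.g. nodes_hp[0]=512
            node=${entry#nodes_hp[}; node=${node%%]*}
            count=${entry#*=}
            echo "$count" | sudo tee >/dev/null \
                "/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages"
        done
    }
    apply_hugenode 'nodes_hp[0]=512,nodes_hp[1]=1024'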
00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 
-- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 
-- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.794 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.794 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.795 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.795 00:11:43 -- setup/common.sh@33 -- # echo 0 00:03:44.795 00:11:43 -- setup/common.sh@33 -- # return 0 00:03:44.795 00:11:43 -- setup/hugepages.sh@99 -- # surp=0 00:03:44.795 00:11:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:44.795 00:11:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:44.795 00:11:43 -- setup/common.sh@18 -- # local node= 00:03:44.795 00:11:43 -- setup/common.sh@19 -- # local var val 00:03:44.795 00:11:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.795 00:11:43 -- 
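The trace above is bash xtrace of a meminfo lookup helper: it snapshots a meminfo file into an array, then walks it with IFS=': ' read, skipping every non-matching key with 'continue'. A minimal sketch of that technique, reconstructed from the traced commands (names follow the trace; the real setup/common.sh may differ):

  #!/usr/bin/env bash
  shopt -s extglob  # needed for the +([0-9]) pattern seen in the trace

  # Look up one key in /proc/meminfo (or in a per-node meminfo file when a
  # node number is given) and print its value, defaulting to 0.
  get_meminfo() {
      local get=$1 node=$2
      local var val _ line
      local mem_f=/proc/meminfo mem
      if [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      # Per-node meminfo lines carry a "Node <N> " prefix; strip it.
      mem=("${mem[@]#Node +([0-9]) }")
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue  # non-matching keys fall through
          echo "${val:-0}"
          return 0
      done
      echo 0  # key absent
  }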
00:03:44.795 00:11:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:44.795 00:11:43 -- setup/common.sh@18 -- # local node=
00:03:44.795 00:11:43 -- setup/common.sh@19 -- # local var val
00:03:44.795 00:11:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.795 00:11:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.795 00:11:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:44.795 00:11:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:44.795 00:11:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.795 00:11:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.795 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.795 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.795 00:11:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43071804 kB' 'MemAvailable: 45452504 kB' 'Buffers: 12536 kB' 'Cached: 9969468 kB' 'SwapCached: 16 kB' 'Active: 8222364 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747012 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598052 kB' 'Mapped: 161224 kB' 'Shmem: 7209352 kB' 'KReclaimable: 243128 kB' 'Slab: 777316 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534188 kB' 'KernelStack: 21936 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 9164196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... repetitive xtrace elided: each snapshot key is compared against HugePages_Rsvd; all non-matching keys hit 'continue' ...]
00:03:44.796 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:44.796 00:11:43 -- setup/common.sh@33 -- # echo 0
00:03:44.796 00:11:43 -- setup/common.sh@33 -- # return 0
00:03:44.796 00:11:43 -- setup/hugepages.sh@100 -- # resv=0
00:03:44.796 00:11:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:03:44.796 nr_hugepages=1536
00:03:44.796 00:11:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:44.796 resv_hugepages=0
00:03:44.796 00:11:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:44.796 surplus_hugepages=0
00:03:44.796 00:11:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:44.796 anon_hugepages=0
00:03:44.796 00:11:43 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:44.796 00:11:43 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:03:44.796 00:11:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:44.796 00:11:43 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:44.796 00:11:43 -- setup/common.sh@18 -- # local node=
00:03:44.796 00:11:43 -- setup/common.sh@19 -- # local var val
00:03:44.796 00:11:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.796 00:11:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.796 00:11:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:44.796 00:11:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:44.796 00:11:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.796 00:11:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.796 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.796 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.796 00:11:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43072308 kB' 'MemAvailable: 45453008 kB' 'Buffers: 12536 kB' 'Cached: 9969468 kB' 'SwapCached: 16 kB' 'Active: 8222364 kB' 'Inactive: 2354388 kB' 'Active(anon): 7747012 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598052 kB' 'Mapped: 161224 kB' 'Shmem: 7209352 kB' 'KReclaimable: 243128 kB' 'Slab: 777316 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534188 kB' 'KernelStack: 21936 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 9164212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... repetitive xtrace elided: each snapshot key is compared against HugePages_Total; all non-matching keys hit 'continue' ...]
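The nr_hugepages/resv_hugepages/surplus_hugepages/anon_hugepages lines above feed a consistency check: the script proceeds only when the kernel's HugePages_Total equals the requested page count plus surplus and reserved pages. A hedged sketch of that arithmetic (variable names as in the trace, reusing the get_meminfo sketch above):

  nr_hugepages=1536                    # requested in this run: 512 (node0) + 1024 (node1)
  anon=$(get_meminfo AnonHugePages)    # transparent hugepages in use, 0 kB here
  surp=$(get_meminfo HugePages_Surp)   # pages allocated beyond the configured pool
  resv=$(get_meminfo HugePages_Rsvd)   # pages reserved for mappings but not yet faulted in
  total=$(get_meminfo HugePages_Total)
  (( total == nr_hugepages + surp + resv )) || echo "unexpected hugepage pool size" >&2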
00:03:44.798 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:44.798 00:11:43 -- setup/common.sh@33 -- # echo 1536
00:03:44.798 00:11:43 -- setup/common.sh@33 -- # return 0
00:03:44.798 00:11:43 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:44.798 00:11:43 -- setup/hugepages.sh@112 -- # get_nodes
00:03:44.798 00:11:43 -- setup/hugepages.sh@27 -- # local node
00:03:44.798 00:11:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:44.798 00:11:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:44.798 00:11:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:44.798 00:11:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:44.798 00:11:43 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:44.798 00:11:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:44.798 00:11:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:44.798 00:11:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:44.798 00:11:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:44.798 00:11:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:44.798 00:11:43 -- setup/common.sh@18 -- # local node=0
00:03:44.798 00:11:43 -- setup/common.sh@19 -- # local var val
00:03:44.798 00:11:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.798 00:11:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.798 00:11:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:44.798 00:11:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:44.798 00:11:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.798 00:11:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.798 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.798 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.798 00:11:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26664560 kB' 'MemUsed: 5927524 kB' 'SwapCached: 16 kB' 'Active: 3150204 kB' 'Inactive: 180704 kB' 'Active(anon): 2933584 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065384 kB' 'Mapped: 96736 kB' 'AnonPages: 268784 kB' 'Shmem: 2668060 kB' 'KernelStack: 11768 kB' 'PageTables: 4620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 381444 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 248620 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... repetitive xtrace elided: each node0 snapshot key is compared against HugePages_Surp; all non-matching keys hit 'continue' ...]
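For the per-node pass, the same helper is pointed at /sys/devices/system/node/node0/meminfo (the @24 reassignment above); the lines in that file carry a "Node <N> " prefix, which the mem=("${mem[@]#Node +([0-9]) }") expansion strips so the key/value parsing stays identical. An assumed usage example, reusing the get_meminfo sketch above (node_dir and n are illustrative names):

  shopt -s extglob
  for node_dir in /sys/devices/system/node/node+([0-9]); do
      n=${node_dir##*node}   # "node0" -> "0", mirroring ${node##*node} in the trace
      echo "node$n HugePages_Total: $(get_meminfo HugePages_Total "$n")"
  done

On this box the snapshots in the log show 512 pages on node0 and 1024 on node1.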
00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.799 00:11:43 -- setup/common.sh@33 -- # echo 0
00:03:44.799 00:11:43 -- setup/common.sh@33 -- # return 0
00:03:44.799 00:11:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:44.799 00:11:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:44.799 00:11:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:44.799 00:11:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
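The nodes_test accumulation above adds each node's reserved and surplus pages onto the expected per-node counts collected by get_nodes. A sketch of that bookkeeping under the same assumptions (seeding nodes_test from the nodes_sys values seen in the trace is an assumption; the script's actual wiring may differ):

  nodes_test=([0]=512 [1]=1024)   # assumed: from get_nodes, nodes_sys[0]=512, nodes_sys[1]=1024
  resv=0                          # from the HugePages_Rsvd lookup earlier in the log
  for node in "${!nodes_test[@]}"; do
      (( nodes_test[node] += resv ))
      (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
  done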
setup/common.sh@18 -- # local node=1 00:03:44.799 00:11:43 -- setup/common.sh@19 -- # local var val 00:03:44.799 00:11:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.799 00:11:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.799 00:11:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:44.799 00:11:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:44.799 00:11:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.799 00:11:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16408016 kB' 'MemUsed: 11295132 kB' 'SwapCached: 0 kB' 'Active: 5072224 kB' 'Inactive: 2173684 kB' 'Active(anon): 4813492 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6916680 kB' 'Mapped: 64488 kB' 'AnonPages: 329280 kB' 'Shmem: 4541336 kB' 'KernelStack: 10168 kB' 'PageTables: 3528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 110304 kB' 'Slab: 395872 kB' 'SReclaimable: 110304 kB' 'SUnreclaim: 285568 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 
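
As a quick consistency check on the node1 snapshot printed by the trace above: MemUsed is simply MemTotal minus MemFree, which the reported numbers confirm.

# Node1 figures from the printf above: 27703148 kB total, 16408016 kB free.
echo $(( 27703148 - 16408016 ))   # 11295132 kB, matching the reported MemUsed
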
00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.799 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.799 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
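
The node1 lookup concludes just below: the surplus comes back 0, is folded into the per-node tally, and the tallies are echoed against the expected split of 512 pages on node0 and 1024 on node1, ending with a string comparison on "512,1024". A sketch of that bookkeeping on top of the get_meminfo sketch above; the tallies are shown pre-seeded here, whereas the real script builds them up and also folds in reserved pages.

# Expected split demanded by the custom_alloc test, per the echoes below.
declare -a expected=([0]=512 [1]=1024)
declare -a nodes_test=([0]=512 [1]=1024)   # tally built up by the test
for node in "${!nodes_test[@]}"; do
    surp=$(get_meminfo HugePages_Surp "$node")
    (( nodes_test[node] += surp ))         # surplus is 0 in this run
    echo "node$node=${nodes_test[node]} expecting ${expected[node]}"
done
[[ "${nodes_test[0]},${nodes_test[1]}" == "512,1024" ]] && echo "split OK"
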
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # continue
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # IFS=': '
00:03:44.800 00:11:43 -- setup/common.sh@31 -- # read -r var val _
00:03:44.800 00:11:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.800 00:11:43 -- setup/common.sh@33 -- # echo 0
00:03:44.800 00:11:43 -- setup/common.sh@33 -- # return 0
00:03:44.800 00:11:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:44.800 00:11:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:44.800 00:11:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:44.800 00:11:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:44.800 00:11:43 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:03:44.800 00:11:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:44.800 00:11:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:44.800 00:11:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:44.800 00:11:43 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:03:44.800 00:11:43 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:44.800
00:03:44.800 real 0m3.713s
00:03:44.800 user 0m1.433s
00:03:44.800 sys 0m2.349s
00:03:44.800 00:11:43 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:44.800 00:11:43 -- common/autotest_common.sh@10 -- # set +x
00:03:44.800 ************************************
00:03:44.800 END TEST custom_alloc
00:03:44.800 ************************************
00:03:44.800 00:11:43 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:44.800 00:11:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:44.800 00:11:43 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:44.800 00:11:43 -- common/autotest_common.sh@10 -- # set +x
00:03:44.800 ************************************
00:03:44.800 START TEST no_shrink_alloc
00:03:44.800 ************************************
00:03:44.800 00:11:43 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:03:44.800 00:11:43 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:44.800 00:11:43 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:44.800 00:11:43 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:44.800 00:11:43 -- setup/hugepages.sh@51 -- # shift
00:03:44.800 00:11:43 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:45.058 00:11:43 -- setup/hugepages.sh@52 -- # local node_ids
00:03:45.058 00:11:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages
)) 00:03:45.058 00:11:43 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:45.058 00:11:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:45.058 00:11:43 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:45.058 00:11:43 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:45.058 00:11:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:45.058 00:11:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:45.058 00:11:43 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:45.058 00:11:43 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:45.058 00:11:43 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:45.058 00:11:43 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:45.058 00:11:43 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:45.058 00:11:43 -- setup/hugepages.sh@73 -- # return 0 00:03:45.058 00:11:43 -- setup/hugepages.sh@198 -- # setup output 00:03:45.058 00:11:43 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.058 00:11:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:48.350 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:48.350 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:48.350 00:11:47 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:48.350 00:11:47 -- setup/hugepages.sh@89 -- # local node 00:03:48.350 00:11:47 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:48.350 00:11:47 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:48.350 00:11:47 -- setup/hugepages.sh@92 -- # local surp 00:03:48.350 00:11:47 -- setup/hugepages.sh@93 -- # local resv 00:03:48.350 00:11:47 -- setup/hugepages.sh@94 -- # local anon 00:03:48.350 00:11:47 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:48.350 00:11:47 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:48.350 00:11:47 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:48.350 00:11:47 -- setup/common.sh@18 -- # local node= 00:03:48.350 00:11:47 -- setup/common.sh@19 -- # local var val 00:03:48.350 00:11:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:48.350 00:11:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.350 00:11:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.350 00:11:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.350 00:11:47 -- setup/common.sh@28 -- # mapfile -t 
mem 00:03:48.350 00:11:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.350 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.350 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44116708 kB' 'MemAvailable: 46497408 kB' 'Buffers: 12536 kB' 'Cached: 9969584 kB' 'SwapCached: 16 kB' 'Active: 8224152 kB' 'Inactive: 2354388 kB' 'Active(anon): 7748800 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599712 kB' 'Mapped: 161288 kB' 'Shmem: 7209468 kB' 'KReclaimable: 243128 kB' 'Slab: 777044 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533916 kB' 'KernelStack: 22096 kB' 'PageTables: 8408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9169620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213652 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 
00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 
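
This second field walk is verify_nr_hugepages probing AnonHugePages; it runs because the [[ always [madvise] never != *\[never\]* ]] check a few lines up showed that transparent hugepages are not globally disabled, and it concludes just below with anon=0. A sketch of that guard, assuming the standard sysfs path for the THP mode:

# Read the THP mode string, e.g. "always [madvise] never"; only probe
# AnonHugePages when the bracketed selection is not [never].
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
anon=0
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
fi
echo "anon=$anon"
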
00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.351 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.351 00:11:47 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.352 00:11:47 -- setup/common.sh@33 -- # echo 0 00:03:48.352 00:11:47 -- setup/common.sh@33 -- # return 0 00:03:48.352 00:11:47 -- setup/hugepages.sh@97 -- # anon=0 00:03:48.352 00:11:47 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:48.352 00:11:47 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.352 00:11:47 -- setup/common.sh@18 -- # local node= 00:03:48.352 00:11:47 -- setup/common.sh@19 -- # local var val 00:03:48.352 00:11:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:48.352 00:11:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.352 00:11:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.352 00:11:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.352 00:11:47 -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.352 00:11:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44120096 kB' 'MemAvailable: 46500796 kB' 'Buffers: 12536 kB' 'Cached: 9969584 kB' 'SwapCached: 16 kB' 'Active: 8223360 kB' 'Inactive: 2354388 kB' 'Active(anon): 7748008 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598924 kB' 'Mapped: 161236 kB' 'Shmem: 7209468 kB' 'KReclaimable: 243128 kB' 'Slab: 777044 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533916 kB' 'KernelStack: 21888 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9168116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213604 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 
00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.352 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.352 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.353 00:11:47 -- setup/common.sh@33 -- # echo 0 00:03:48.353 00:11:47 -- setup/common.sh@33 -- # return 0 00:03:48.353 00:11:47 -- setup/hugepages.sh@99 -- # surp=0 00:03:48.353 00:11:47 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:48.353 00:11:47 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:48.353 00:11:47 -- setup/common.sh@18 -- # local node= 00:03:48.353 00:11:47 -- setup/common.sh@19 -- # local var val 00:03:48.353 00:11:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:48.353 00:11:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.353 00:11:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.353 00:11:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.353 00:11:47 -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.353 00:11:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44120520 kB' 'MemAvailable: 46501220 kB' 'Buffers: 12536 kB' 'Cached: 9969584 kB' 'SwapCached: 16 kB' 'Active: 8223552 kB' 'Inactive: 2354388 kB' 'Active(anon): 7748200 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599108 kB' 'Mapped: 161236 kB' 'Shmem: 7209468 kB' 'KReclaimable: 243128 kB' 'Slab: 777044 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533916 kB' 'KernelStack: 22016 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9169644 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213636 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:48.353 00:11:47 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.353 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.353 00:11:47 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # continue 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.354 00:11:47 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.354 00:11:47 -- 
setup/common.sh@32 -- # continue
00:03:48.354 00:11:47 -- setup/common.sh@31 -- # IFS=': '
00:03:48.354 00:11:47 -- setup/common.sh@31 -- # read -r var val _
[per-field scan elided: SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total and HugePages_Free all fail the HugePages_Rsvd match and hit setup/common.sh@32 'continue']
00:03:48.355 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:48.355 00:11:47 -- setup/common.sh@33 -- # echo 0
00:03:48.355 00:11:47 -- setup/common.sh@33 -- # return 0
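The trace above is the whole of get_meminfo's inner loop: mapfile slurps the meminfo file into an array, any "Node N " prefix is stripped, and each line is split on ': ' until the requested field matches, at which point its value is echoed. A minimal stand-alone sketch of that pattern follows; it is reconstructed from the trace, not SPDK's setup/common.sh verbatim, and the sed-based prefix strip is a substitution for the extglob expansion the traced script uses.

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo var val _
    # Per-node queries read the sysfs copy of meminfo when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Split each "Field: value kB" line on ': ' and stop at the first match.
    # Per-node lines carry a "Node N " prefix; the traced script strips it
    # with mem=("${mem[@]#Node +([0-9]) }"), sed does the same job here.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}

get_meminfo HugePages_Rsvd       # prints 0 on the box in this log
get_meminfo HugePages_Surp 0     # node-0 query, also 0 here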
00:03:48.355 00:11:47 -- setup/hugepages.sh@100 -- # resv=0
00:03:48.355 00:11:47 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:48.355 nr_hugepages=1024
00:03:48.355 00:11:47 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:48.355 resv_hugepages=0
00:03:48.355 00:11:47 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:48.355 surplus_hugepages=0
00:03:48.355 00:11:47 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:48.355 anon_hugepages=0
00:03:48.355 00:11:47 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:48.355 00:11:47 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:48.355 00:11:47 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:48.355 00:11:47 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:48.355 00:11:47 -- setup/common.sh@18 -- # local node=
00:03:48.355 00:11:47 -- setup/common.sh@19 -- # local var val
00:03:48.355 00:11:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.355 00:11:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.355 00:11:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:48.355 00:11:47 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:48.355 00:11:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.355 00:11:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.355 00:11:47 -- setup/common.sh@31 -- # IFS=': '
00:03:48.355 00:11:47 -- setup/common.sh@31 -- # read -r var val _
00:03:48.355 00:11:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44121692 kB' 'MemAvailable: 46502392 kB' 'Buffers: 12536 kB' 'Cached: 9969584 kB' 'SwapCached: 16 kB' 'Active: 8224496 kB' 'Inactive: 2354388 kB' 'Active(anon): 7749144 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599584 kB' 'Mapped: 161228 kB' 'Shmem: 7209468 kB' 'KReclaimable: 243128 kB' 'Slab: 777044 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 533916 kB' 'KernelStack: 21936 kB' 'PageTables: 7956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9179856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213556 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[per-field scan elided: MemTotal through HugePages_Free all fail the HugePages_Total match and hit setup/common.sh@32 'continue']
00:03:48.356 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:48.356 00:11:47 -- setup/common.sh@33 -- # echo 1024
00:03:48.356 00:11:47 -- setup/common.sh@33 -- # return 0
00:03:48.356 00:11:47 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:48.356 00:11:47 -- setup/hugepages.sh@112 -- # get_nodes
00:03:48.356 00:11:47 -- setup/hugepages.sh@27 -- # local node
00:03:48.356 00:11:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:48.356 00:11:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:48.356 00:11:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:48.356 00:11:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:48.356 00:11:47 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:48.356 00:11:47 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
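get_nodes, traced just above, globs the sysfs node directories and records one hugepage count per NUMA node; the @115-@117 loop that follows then folds surplus and reserved pages into the expected per-node totals. A hedged sketch of the enumeration, reusing the get_meminfo sketch from earlier (extglob is required for the +([0-9]) glob; variable names mirror the trace but the body is illustrative):

shopt -s extglob nullglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    # Key the array by node number; value is that node's hugepage total.
    nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
done
no_nodes=${#nodes_sys[@]}
(( no_nodes > 0 )) || exit 1
# On the box in this log: nodes_sys=(1024 0), no_nodes=2.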
00:03:48.356 00:11:47 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:48.356 00:11:47 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:48.356 00:11:47 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:48.356 00:11:47 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:48.356 00:11:47 -- setup/common.sh@18 -- # local node=0
00:03:48.356 00:11:47 -- setup/common.sh@19 -- # local var val
00:03:48.356 00:11:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.356 00:11:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.357 00:11:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:48.357 00:11:47 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:48.357 00:11:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.357 00:11:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.357 00:11:47 -- setup/common.sh@31 -- # IFS=': '
00:03:48.357 00:11:47 -- setup/common.sh@31 -- # read -r var val _
00:03:48.357 00:11:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25616776 kB' 'MemUsed: 6975308 kB' 'SwapCached: 16 kB' 'Active: 3151028 kB' 'Inactive: 180704 kB' 'Active(anon): 2934408 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065436 kB' 'Mapped: 97248 kB' 'AnonPages: 269448 kB' 'Shmem: 2668112 kB' 'KernelStack: 11720 kB' 'PageTables: 4468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 381220 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 248396 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[per-field scan of the node0 meminfo elided: MemTotal through HugePages_Free all fail the HugePages_Surp match and hit setup/common.sh@32 'continue']
00:03:48.358 00:11:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:48.358 00:11:47 -- setup/common.sh@33 -- # echo 0
00:03:48.358 00:11:47 -- setup/common.sh@33 -- # return 0
00:03:48.358 00:11:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:48.358 00:11:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:48.358 00:11:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:48.358 00:11:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:48.358 00:11:47 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:48.358 node0=1024 expecting 1024
00:03:48.358 00:11:47 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:48.358 00:11:47 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:48.358 00:11:47 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:48.358 00:11:47 -- setup/hugepages.sh@202 -- # setup output
00:03:48.358 00:11:47 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:48.358 00:11:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:51.681 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:51.681 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:51.681 INFO: Requested 512 hugepages but 1024 already allocated on node0
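The INFO line is scripts/setup.sh declining to shrink an existing reservation: NRHUGE=512 was requested, node0 already holds 1024 pages, and with CLEAR_HUGE=no they are left in place. A minimal sketch of that idempotence check, using the standard per-node sysfs knob for 2048 kB pages; the surrounding decision logic is an assumption for illustration, not setup.sh verbatim:

NRHUGE=${NRHUGE:-512}
nr_path=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
current=$(<"$nr_path")
if (( current >= NRHUGE )); then
    # Existing allocation already covers the request; keep it.
    echo "INFO: Requested $NRHUGE hugepages but $current already allocated on node0"
else
    echo "$NRHUGE" > "$nr_path"    # needs root
fi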
00:03:51.681 00:11:50 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:03:51.681 00:11:50 -- setup/hugepages.sh@89 -- # local node
00:03:51.681 00:11:50 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:51.681 00:11:50 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:51.681 00:11:50 -- setup/hugepages.sh@92 -- # local surp
00:03:51.681 00:11:50 -- setup/hugepages.sh@93 -- # local resv
00:03:51.681 00:11:50 -- setup/hugepages.sh@94 -- # local anon
00:03:51.681 00:11:50 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:51.681 00:11:50 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:51.681 00:11:50 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:51.681 00:11:50 -- setup/common.sh@18 -- # local node=
00:03:51.681 00:11:50 -- setup/common.sh@19 -- # local var val
00:03:51.681 00:11:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:51.681 00:11:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:51.681 00:11:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:51.681 00:11:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:51.681 00:11:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:51.681 00:11:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:51.681 00:11:50 -- setup/common.sh@31 -- # IFS=': '
00:03:51.681 00:11:50 -- setup/common.sh@31 -- # read -r var val _
00:03:51.681 00:11:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44152424 kB' 'MemAvailable: 46533124 kB' 'Buffers: 12536 kB' 'Cached: 9969700 kB' 'SwapCached: 16 kB' 'Active: 8226412 kB' 'Inactive: 2354388 kB' 'Active(anon): 7751060 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601292 kB' 'Mapped: 161336 kB' 'Shmem: 7209584 kB' 'KReclaimable: 243128 kB' 'Slab: 777384 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534256 kB' 'KernelStack: 22048 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9165708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213668 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[per-field scan elided: MemTotal through HardwareCorrupted all fail the AnonHugePages match and hit setup/common.sh@32 'continue']
00:03:51.682 00:11:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:51.682 00:11:50 -- setup/common.sh@33 -- # echo 0
00:03:51.682 00:11:50 -- setup/common.sh@33 -- # return 0
00:03:51.682 00:11:50 -- setup/hugepages.sh@97 -- # anon=0
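verify_nr_hugepages only bothers reading AnonHugePages when transparent hugepages are enabled: the @96 test checks that the bracketed active mode in the THP sysfs file is not [never] ("always [madvise] never" on this machine, so the read happens). A small sketch of the same gate, assuming the standard sysfs path and reusing the earlier get_meminfo sketch:

thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)   # 0 (kB) in this run
else
    anon=0                              # THP off, nothing to count
fi
echo "anon_hugepages=$anon"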
00:03:51.682 00:11:50 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:51.682 00:11:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:51.682 00:11:50 -- setup/common.sh@18 -- # local node=
00:03:51.682 00:11:50 -- setup/common.sh@19 -- # local var val
00:03:51.682 00:11:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:51.682 00:11:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:51.682 00:11:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:51.682 00:11:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:51.682 00:11:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:51.682 00:11:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:51.682 00:11:50 -- setup/common.sh@31 -- # IFS=': '
00:03:51.682 00:11:50 -- setup/common.sh@31 -- # read -r var val _
00:03:51.682 00:11:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44154144 kB' 'MemAvailable: 46534844 kB' 'Buffers: 12536 kB' 'Cached: 9969700 kB' 'SwapCached: 16 kB' 'Active: 8226276 kB' 'Inactive: 2354388 kB' 'Active(anon): 7750924 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601176 kB' 'Mapped: 161336 kB' 'Shmem: 7209584 kB' 'KReclaimable: 243128 kB' 'Slab: 777372 kB' 'SReclaimable: 243128 kB' 'SUnreclaim: 534244 kB' 'KernelStack: 21872 kB' 'PageTables: 7776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9165720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213556 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[per-field scan elided: MemTotal through HugePages_Rsvd all fail the HugePages_Surp match and hit setup/common.sh@32 'continue']
00:03:51.994 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:51.994 00:11:50 -- setup/common.sh@33 -- # echo 0
00:03:51.994 00:11:50 -- setup/common.sh@33 -- # return 0
00:03:51.994 00:11:50 -- setup/hugepages.sh@99 -- # surp=0
00:03:51.994 00:11:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:51.994 00:11:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:51.994 00:11:50 -- setup/common.sh@18 -- # local node=
00:03:51.994 00:11:50 -- setup/common.sh@19 -- # local var val
00:03:51.994 00:11:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:51.994 00:11:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:51.994 00:11:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:51.994 00:11:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:51.994 00:11:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:51.994 00:11:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:51.994 00:11:50 -- setup/common.sh@31 -- # IFS=': '
00:03:51.994 00:11:50 -- setup/common.sh@31 -- # read -r var val _
00:03:51.994 00:11:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44153040 kB' 'MemAvailable: 46533736 kB' 'Buffers: 12536 kB' 'Cached: 9969712 kB' 'SwapCached: 16 kB'
'Active: 8225368 kB' 'Inactive: 2354388 kB' 'Active(anon): 7750016 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600296 kB' 'Mapped: 161316 kB' 'Shmem: 7209596 kB' 'KReclaimable: 243120 kB' 'Slab: 777332 kB' 'SReclaimable: 243120 kB' 'SUnreclaim: 534212 kB' 'KernelStack: 21904 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9201936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213572 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:51.994 00:11:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.994 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.994 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.994 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.994 00:11:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.994 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.994 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.994 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.994 00:11:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.994 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.994 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.994 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val 
_ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.995 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.995 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 
00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.996 00:11:50 -- setup/common.sh@33 -- # echo 0 00:03:51.996 00:11:50 -- setup/common.sh@33 -- # return 0 00:03:51.996 00:11:50 -- setup/hugepages.sh@100 -- # resv=0 00:03:51.996 00:11:50 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:51.996 nr_hugepages=1024 00:03:51.996 00:11:50 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:51.996 resv_hugepages=0 00:03:51.996 00:11:50 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:51.996 surplus_hugepages=0 00:03:51.996 00:11:50 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:51.996 anon_hugepages=0 00:03:51.996 00:11:50 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:51.996 00:11:50 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:51.996 00:11:50 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:51.996 00:11:50 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:51.996 00:11:50 -- setup/common.sh@18 -- # local node= 00:03:51.996 00:11:50 -- setup/common.sh@19 -- # local var val 00:03:51.996 00:11:50 -- setup/common.sh@20 -- # local mem_f mem 00:03:51.996 00:11:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.996 00:11:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:51.996 00:11:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:51.996 00:11:50 -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.996 00:11:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 44153344 kB' 'MemAvailable: 46534040 kB' 'Buffers: 12536 kB' 'Cached: 9969728 kB' 'SwapCached: 16 kB' 'Active: 8225360 kB' 'Inactive: 2354388 kB' 'Active(anon): 7750008 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601220 kB' 'Mapped: 161744 kB' 'Shmem: 7209612 kB' 'KReclaimable: 243120 kB' 'Slab: 777328 kB' 'SReclaimable: 243120 kB' 'SUnreclaim: 534208 kB' 'KernelStack: 21824 kB' 'PageTables: 7748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 9167396 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 78848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- 
setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.996 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.996 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var 
val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.997 00:11:50 -- 
setup/common.sh@33 -- # echo 1024 00:03:51.997 00:11:50 -- setup/common.sh@33 -- # return 0 00:03:51.997 00:11:50 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:51.997 00:11:50 -- setup/hugepages.sh@112 -- # get_nodes 00:03:51.997 00:11:50 -- setup/hugepages.sh@27 -- # local node 00:03:51.997 00:11:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:51.997 00:11:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:51.997 00:11:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:51.997 00:11:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:51.997 00:11:50 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:51.997 00:11:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:51.997 00:11:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:51.997 00:11:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:51.997 00:11:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:51.997 00:11:50 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:51.997 00:11:50 -- setup/common.sh@18 -- # local node=0 00:03:51.997 00:11:50 -- setup/common.sh@19 -- # local var val 00:03:51.997 00:11:50 -- setup/common.sh@20 -- # local mem_f mem 00:03:51.997 00:11:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.997 00:11:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:51.997 00:11:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:51.997 00:11:50 -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.997 00:11:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25623184 kB' 'MemUsed: 6968900 kB' 'SwapCached: 16 kB' 'Active: 3156248 kB' 'Inactive: 180704 kB' 'Active(anon): 2939628 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3065484 kB' 'Mapped: 97256 kB' 'AnonPages: 274664 kB' 'Shmem: 2668160 kB' 'KernelStack: 11768 kB' 'PageTables: 4656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132824 kB' 'Slab: 381260 kB' 'SReclaimable: 132824 kB' 'SUnreclaim: 248436 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read 
-r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.997 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.997 00:11:50 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 
00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # continue 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.998 00:11:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.998 00:11:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.998 00:11:50 -- setup/common.sh@33 -- # echo 0 00:03:51.998 00:11:50 -- setup/common.sh@33 -- # return 0 00:03:51.998 00:11:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:51.998 00:11:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:51.998 00:11:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:51.998 00:11:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:51.998 00:11:50 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:51.998 node0=1024 expecting 1024 00:03:51.998 00:11:50 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:51.998 00:03:51.998 real 0m6.959s 00:03:51.998 user 0m2.615s 00:03:51.998 sys 0m4.483s 00:03:51.998 00:11:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:51.998 00:11:50 -- common/autotest_common.sh@10 -- # set +x 00:03:51.998 ************************************ 00:03:51.998 END TEST no_shrink_alloc 00:03:51.998 ************************************ 00:03:51.998 00:11:50 -- setup/hugepages.sh@217 -- # clear_hp 00:03:51.998 00:11:50 -- setup/hugepages.sh@37 -- # local node hp 00:03:51.998 00:11:50 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:51.998 
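Every scan condensed above is one call to the get_meminfo helper (test/setup/common.sh, invoked from hugepages.sh): it walks a meminfo file field by field and echoes the value of the field it was asked for, switching to the per-node sysfs copy when a node index is passed. A minimal bash sketch of that behavior follows; the function name and the sed-based "Node N" prefix strip are illustrative assumptions, not the verbatim SPDK helper (which, as the xtrace shows, uses mapfile plus an extglob substitution):

get_meminfo_sketch() {
    # Usage: get_meminfo_sketch <Field> [node], e.g. get_meminfo_sketch HugePages_Surp 0
    local get=$1 node=${2:-} mem_f=/proc/meminfo var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix each line with "Node N "; drop it so the keys line up.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # bare value, e.g. 0 or 1024; any "kB" unit lands in $_
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}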
00:03:51.998 00:11:50 -- setup/hugepages.sh@217 -- # clear_hp
00:03:51.998 00:11:50 -- setup/hugepages.sh@37 -- # local node hp
00:03:51.998 00:11:50 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:03:51.998 00:11:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:51.998 00:11:50 -- setup/hugepages.sh@41 -- # echo 0
[... xtrace elided: the @40/@41 pair repeats for each hugepage size on each of the 2 nodes ...]
00:03:51.998 00:11:50 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:03:51.998 00:11:50 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:03:51.998 
00:03:51.998 real 0m26.709s
00:03:51.998 user 0m9.516s
00:03:51.998 sys 0m16.055s
00:03:51.998 00:11:50 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:51.998 00:11:50 -- common/autotest_common.sh@10 -- # set +x
00:03:51.998 ************************************
00:03:51.998 END TEST hugepages
00:03:51.998 ************************************
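Two pieces of hugepages.sh logic just ran to completion above. First, the suite asserts that the kernel's HugePages_Total equals the requested page count plus surplus and reserved pages (1024 == 1024 + 0 + 0 in this run), then repeats the bookkeeping per NUMA node. Second, clear_hp returns every per-node hugepage pool to zero before the next suite starts. A sketch of both steps, assuming bash, the get_meminfo_sketch helper above, and this run's 2-node topology; the function names and the NRHUGE default are illustrative, not the verbatim SPDK code:

# The invariant the suite checks, globally and then per node.
verify_hugepages_sketch() {
    local total surp resv
    total=$(get_meminfo_sketch HugePages_Total)
    surp=$(get_meminfo_sketch HugePages_Surp)
    resv=$(get_meminfo_sketch HugePages_Rsvd)
    (( total == ${NRHUGE:-1024} + surp + resv ))
}

# Cleanup mirroring clear_hp: zero each node's pools (every page size; needs root).
clear_hp_sketch() {
    local node hp
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
    export CLEAR_HUGE=yes
}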
00:03:51.998 00:11:50 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:03:51.998 00:11:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:51.998 00:11:50 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:51.998 00:11:50 -- common/autotest_common.sh@10 -- # set +x
00:03:51.998 ************************************
00:03:51.998 START TEST driver
00:03:51.998 ************************************
00:03:51.998 00:11:50 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:03:51.998 * Looking for test storage...
00:03:51.998 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:51.998 00:11:51 -- setup/driver.sh@68 -- # setup reset
00:03:51.998 00:11:51 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:51.998 00:11:51 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:57.274 00:11:55 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:03:57.274 00:11:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:57.274 00:11:55 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:57.274 00:11:55 -- common/autotest_common.sh@10 -- # set +x
00:03:57.274 ************************************
00:03:57.274 START TEST guess_driver
00:03:57.274 ************************************
00:03:57.274 00:11:55 -- common/autotest_common.sh@1104 -- # guess_driver
00:03:57.274 00:11:55 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:03:57.274 00:11:55 -- setup/driver.sh@47 -- # local fail=0
00:03:57.274 00:11:55 -- setup/driver.sh@49 -- # pick_driver
00:03:57.274 00:11:55 -- setup/driver.sh@36 -- # vfio
00:03:57.274 00:11:55 -- setup/driver.sh@21 -- # local iommu_groups
00:03:57.274 00:11:55 -- setup/driver.sh@22 -- # local unsafe_vfio
00:03:57.274 00:11:55 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:03:57.274 00:11:55 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:03:57.274 00:11:55 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:03:57.274 00:11:55 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:03:57.274 00:11:55 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:03:57.274 00:11:55 -- setup/driver.sh@14 -- # mod vfio_pci
00:03:57.274 00:11:55 -- setup/driver.sh@12 -- # dep vfio_pci
00:03:57.274 00:11:55 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:03:57.274 00:11:55 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:03:57.274 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:03:57.274 00:11:55 -- setup/driver.sh@30 -- # return 0
00:03:57.274 00:11:55 -- setup/driver.sh@37 -- # echo vfio-pci
00:03:57.274 00:11:55 -- setup/driver.sh@49 -- # driver=vfio-pci
00:03:57.274 00:11:55 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:03:57.274 00:11:55 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:03:57.274 Looking for driver=vfio-pci
00:03:57.274 00:11:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:03:57.274 00:11:55 -- setup/driver.sh@45 -- # setup output config
00:03:57.274 00:11:55 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:57.274 00:11:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
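The driver pick above happens in pick_driver's vfio branch: with /sys/module/vfio/parameters/enable_unsafe_noiommu_mode reporting N, the script counts the populated IOMMU groups (176 on this box) and accepts vfio-pci once modprobe --show-depends resolves the module chain to real .ko objects. A condensed bash sketch of that decision; the uio_pci_generic fallback is an assumption about the branch not taken in this run, and the function name is illustrative:

pick_driver_sketch() {
    shopt -s nullglob   # so an empty iommu_groups dir counts as zero groups
    local groups=(/sys/kernel/iommu_groups/*)
    # vfio-pci is only usable when the IOMMU is populated and the module resolves.
    if (( ${#groups[@]} > 0 )) && modprobe --show-depends vfio_pci &> /dev/null; then
        echo vfio-pci
    elif modprobe --show-depends uio_pci_generic &> /dev/null; then
        echo uio_pci_generic    # no usable IOMMU: fall back to uio
    else
        echo 'No valid driver found'
    fi
}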
vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.561 00:11:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:00.561 00:11:59 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:00.561 00:11:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.940 00:12:00 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.940 00:12:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.940 00:12:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.940 00:12:00 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:01.940 00:12:00 -- setup/driver.sh@65 -- # setup reset 00:04:01.940 00:12:00 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:01.940 00:12:00 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:06.127 00:04:06.127 real 0m9.426s 00:04:06.127 user 0m2.283s 00:04:06.127 sys 0m4.703s 00:04:06.127 00:12:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:06.127 00:12:04 -- common/autotest_common.sh@10 -- # set +x 00:04:06.127 ************************************ 00:04:06.127 END TEST guess_driver 00:04:06.127 ************************************ 00:04:06.127 00:04:06.127 real 0m14.039s 00:04:06.127 user 0m3.489s 00:04:06.127 sys 0m7.306s 00:04:06.127 00:12:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:06.127 00:12:04 -- common/autotest_common.sh@10 -- # set +x 00:04:06.127 ************************************ 00:04:06.127 END TEST driver 00:04:06.127 ************************************ 00:04:06.127 00:12:04 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:06.127 00:12:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:06.127 00:12:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:06.127 00:12:04 -- common/autotest_common.sh@10 -- # set +x 00:04:06.127 ************************************ 00:04:06.127 START TEST devices 00:04:06.127 ************************************ 00:04:06.127 00:12:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:06.127 * Looking for test storage... 
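Before the devices output continues: the guess_driver pass above reduces to picking vfio-pci whenever the IOMMU is active (non-empty /sys/kernel/iommu_groups; 176 groups in this run) and the module resolves via modprobe. A condensed sketch of that decision; the uio_pci_generic fallback branch is an assumption mirroring setup.sh behavior and is not exercised in this log.
shopt -s nullglob
iommu_groups=(/sys/kernel/iommu_groups/*)
if (( ${#iommu_groups[@]} > 0 )) && modprobe --show-depends vfio_pci >/dev/null 2>&1; then
  driver=vfio-pci          # IOMMU on and vfio_pci resolvable, as in the trace above
else
  driver=uio_pci_generic   # assumed fallback; not hit in this run
fi
echo "Looking for driver=$driver"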
00:04:06.127 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:06.127 00:12:05 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:06.127 00:12:05 -- setup/devices.sh@192 -- # setup reset 00:04:06.127 00:12:05 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:06.127 00:12:05 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:10.341 00:12:08 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:10.341 00:12:08 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:10.341 00:12:08 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:10.341 00:12:08 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:10.341 00:12:08 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:10.341 00:12:08 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:10.341 00:12:08 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:10.341 00:12:08 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:10.341 00:12:08 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:10.341 00:12:08 -- setup/devices.sh@196 -- # blocks=() 00:04:10.341 00:12:08 -- setup/devices.sh@196 -- # declare -a blocks 00:04:10.341 00:12:08 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:10.341 00:12:08 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:10.341 00:12:08 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:10.341 00:12:08 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:10.341 00:12:08 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:10.341 00:12:08 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:10.341 00:12:08 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:10.341 00:12:08 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:10.341 00:12:08 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:10.341 00:12:08 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:10.341 00:12:08 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:10.341 No valid GPT data, bailing 00:04:10.341 00:12:08 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:10.341 00:12:08 -- scripts/common.sh@393 -- # pt= 00:04:10.341 00:12:08 -- scripts/common.sh@394 -- # return 1 00:04:10.341 00:12:08 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:10.341 00:12:08 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:10.341 00:12:08 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:10.341 00:12:08 -- setup/common.sh@80 -- # echo 1600321314816 00:04:10.341 00:12:08 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:10.341 00:12:08 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:10.341 00:12:08 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:10.341 00:12:08 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:10.341 00:12:08 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:10.341 00:12:08 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:10.341 00:12:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:10.341 00:12:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:10.341 00:12:08 -- common/autotest_common.sh@10 -- # set +x 00:04:10.341 ************************************ 00:04:10.341 START TEST nvme_mount 00:04:10.341 ************************************ 00:04:10.341 
00:12:08 -- common/autotest_common.sh@1104 -- # nvme_mount
00:04:10.341 00:12:08 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1
00:04:10.341 00:12:08 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1
00:04:10.341 00:12:08 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:10.341 00:12:08 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:10.341 00:12:08 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1
00:04:10.341 00:12:08 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:10.341 00:12:08 -- setup/common.sh@40 -- # local part_no=1
00:04:10.341 00:12:08 -- setup/common.sh@41 -- # local size=1073741824
00:04:10.341 00:12:08 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:10.341 00:12:08 -- setup/common.sh@44 -- # parts=()
00:04:10.341 00:12:08 -- setup/common.sh@44 -- # local parts
00:04:10.341 00:12:08 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:10.341 00:12:08 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:10.341 00:12:08 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:10.341 00:12:08 -- setup/common.sh@46 -- # (( part++ ))
00:04:10.341 00:12:08 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:10.341 00:12:08 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:10.341 00:12:08 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:04:10.341 00:12:08 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:10.911 Creating new GPT entries in memory.
00:04:10.911 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:10.911 other utilities.
00:04:10.911 00:12:09 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:10.911 00:12:09 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:10.911 00:12:09 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:10.911 00:12:09 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:10.911 00:12:09 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:12.290 Creating new GPT entries in memory.
00:04:12.290 The operation has completed successfully.
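The partition step just traced reduces to two destructive sgdisk invocations. As a standalone sketch (device name taken from this run; only ever run this against a disposable test disk):
disk=/dev/nvme0n1
sgdisk "$disk" --zap-all                            # wipe any existing GPT/MBR structures, as above
flock "$disk" sgdisk "$disk" --new=1:2048:2099199   # (2099199-2048+1) * 512 B sectors = 1 GiB partition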
00:04:12.290 00:12:10 -- setup/common.sh@57 -- # (( part++ )) 00:04:12.290 00:12:10 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:12.290 00:12:10 -- setup/common.sh@62 -- # wait 286331 00:04:12.290 00:12:10 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:12.290 00:12:10 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:12.290 00:12:10 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:12.290 00:12:10 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:12.290 00:12:10 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:12.290 00:12:10 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:12.290 00:12:11 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:12.290 00:12:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:12.290 00:12:11 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:12.290 00:12:11 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:12.290 00:12:11 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:12.290 00:12:11 -- setup/devices.sh@53 -- # local found=0 00:04:12.290 00:12:11 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:12.290 00:12:11 -- setup/devices.sh@56 -- # : 00:04:12.290 00:12:11 -- setup/devices.sh@59 -- # local pci status 00:04:12.290 00:12:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:12.290 00:12:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:12.290 00:12:11 -- setup/devices.sh@47 -- # setup output config 00:04:12.290 00:12:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.290 00:12:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:14.824 00:12:13 -- setup/devices.sh@63 -- # found=1 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.824 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.824 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:13 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:15.082 00:12:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.082 00:12:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:15.082 00:12:14 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:15.082 00:12:14 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.082 00:12:14 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:15.082 00:12:14 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.082 00:12:14 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:15.082 00:12:14 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.082 00:12:14 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.082 00:12:14 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:15.082 00:12:14 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:15.082 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:15.082 00:12:14 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:15.082 00:12:14 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:15.341 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:15.341 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:15.341 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:15.341 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:15.341 00:12:14 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:15.341 00:12:14 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:15.341 00:12:14 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.341 00:12:14 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:15.341 00:12:14 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:15.601 00:12:14 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.601 00:12:14 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.601 00:12:14 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:15.601 00:12:14 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:15.601 00:12:14 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.601 00:12:14 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.601 00:12:14 -- setup/devices.sh@53 -- # local found=0 00:04:15.601 00:12:14 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:15.601 00:12:14 -- setup/devices.sh@56 -- # : 00:04:15.601 00:12:14 -- setup/devices.sh@59 -- # local pci status 00:04:15.601 00:12:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.601 00:12:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:15.601 00:12:14 -- setup/devices.sh@47 -- # setup output config 00:04:15.601 00:12:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.601 00:12:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:18.895 00:12:17 -- setup/devices.sh@63 -- # found=1 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:18.895 00:12:17 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:18.895 00:12:17 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.895 00:12:17 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:18.895 00:12:17 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:18.895 00:12:17 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.895 00:12:17 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:18.895 00:12:17 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:18.895 00:12:17 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:18.895 00:12:17 -- setup/devices.sh@50 -- # local mount_point= 00:04:18.895 00:12:17 -- setup/devices.sh@51 -- # local test_file= 00:04:18.895 00:12:17 -- setup/devices.sh@53 -- # local found=0 00:04:18.895 00:12:17 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:18.895 00:12:17 -- setup/devices.sh@59 -- # local pci status 00:04:18.895 00:12:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.895 00:12:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:18.895 00:12:17 -- setup/devices.sh@47 -- # setup output config 00:04:18.895 00:12:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:18.895 00:12:17 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:22.188 00:12:20 -- setup/devices.sh@63 -- # found=1 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.188 00:12:20 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:22.188 00:12:20 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:22.188 00:12:20 -- setup/devices.sh@68 -- # return 0 00:04:22.188 00:12:20 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:22.188 00:12:20 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.188 00:12:20 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:22.188 00:12:20 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:22.188 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:22.188 00:04:22.188 real 0m11.877s 00:04:22.188 user 0m3.234s 00:04:22.188 sys 0m6.352s 00:04:22.188 00:12:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:22.188 00:12:20 -- common/autotest_common.sh@10 -- # set +x 00:04:22.188 ************************************ 00:04:22.188 END TEST nvme_mount 00:04:22.188 ************************************ 00:04:22.188 00:12:20 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:22.188 00:12:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:22.188 00:12:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:22.188 00:12:20 -- common/autotest_common.sh@10 -- # set +x 00:04:22.188 ************************************ 00:04:22.188 START TEST dm_mount 00:04:22.188 ************************************ 00:04:22.188 00:12:20 -- common/autotest_common.sh@1104 -- # dm_mount 00:04:22.188 00:12:20 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:22.188 00:12:20 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:22.188 00:12:20 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:22.188 00:12:20 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:22.188 00:12:20 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:22.188 00:12:20 -- setup/common.sh@40 -- # local part_no=2 00:04:22.188 00:12:20 -- setup/common.sh@41 -- # local size=1073741824 00:04:22.188 00:12:20 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:22.188 00:12:20 -- setup/common.sh@44 -- # parts=() 00:04:22.188 00:12:20 -- setup/common.sh@44 -- # local parts 00:04:22.188 00:12:20 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:22.188 00:12:20 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:22.188 00:12:20 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:22.188 00:12:20 -- setup/common.sh@46 -- # (( part++ )) 00:04:22.188 00:12:20 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:22.188 00:12:20 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:22.188 00:12:20 -- setup/common.sh@46 -- # (( part++ )) 00:04:22.188 00:12:20 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:22.188 00:12:20 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:22.188 00:12:20 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:22.188 00:12:20 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:23.126 Creating new GPT entries in memory. 00:04:23.126 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:23.126 other utilities. 00:04:23.126 00:12:21 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:23.126 00:12:21 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:23.126 00:12:21 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:23.126 00:12:21 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:23.126 00:12:21 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:24.064 Creating new GPT entries in memory. 00:04:24.064 The operation has completed successfully. 
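The dm_mount test now underway carves two 1 GiB partitions (the second sgdisk call follows below) and then stacks a device-mapper node named nvme_dm_test on them. The log only shows 'dmsetup create nvme_dm_test', so the concatenated linear table in this sketch is an illustrative guess, not the harness's actual table:
# Each partition is 2097152 sectors (1 GiB); append p2 after p1 in one linear device.
dmsetup create nvme_dm_test <<'EOF'
0 2097152 linear /dev/nvme0n1p1 0
2097152 2097152 linear /dev/nvme0n1p2 0
EOF
readlink -f /dev/mapper/nvme_dm_test   # resolves to /dev/dm-0 in the run below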
00:04:24.064 00:12:22 -- setup/common.sh@57 -- # (( part++ )) 00:04:24.064 00:12:22 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:24.064 00:12:22 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:24.064 00:12:22 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:24.064 00:12:22 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:25.001 The operation has completed successfully. 00:04:25.001 00:12:23 -- setup/common.sh@57 -- # (( part++ )) 00:04:25.001 00:12:23 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:25.001 00:12:23 -- setup/common.sh@62 -- # wait 290700 00:04:25.001 00:12:23 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:25.001 00:12:23 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:25.001 00:12:23 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:25.001 00:12:23 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:25.001 00:12:23 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:25.001 00:12:23 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:25.001 00:12:23 -- setup/devices.sh@161 -- # break 00:04:25.001 00:12:23 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:25.001 00:12:23 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:25.001 00:12:23 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:25.001 00:12:23 -- setup/devices.sh@166 -- # dm=dm-0 00:04:25.001 00:12:23 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:25.001 00:12:23 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:25.001 00:12:23 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:25.001 00:12:23 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:25.001 00:12:23 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:25.001 00:12:23 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:25.001 00:12:23 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:25.001 00:12:23 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:25.001 00:12:23 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:25.001 00:12:23 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:25.001 00:12:23 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:25.001 00:12:23 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:25.001 00:12:23 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:25.001 00:12:23 -- setup/devices.sh@53 -- # local found=0 00:04:25.001 00:12:23 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:25.001 00:12:23 -- setup/devices.sh@56 -- # : 00:04:25.001 
00:12:23 -- setup/devices.sh@59 -- # local pci status 00:04:25.001 00:12:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.001 00:12:23 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:25.001 00:12:23 -- setup/devices.sh@47 -- # setup output config 00:04:25.001 00:12:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:25.001 00:12:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:28.292 00:12:27 -- setup/devices.sh@63 -- # found=1 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.292 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.292 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.293 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.293 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.293 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.293 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.293 00:12:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.293 00:12:27 -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:28.293 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.553 00:12:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:28.553 00:12:27 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:28.553 00:12:27 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:28.553 00:12:27 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:28.553 00:12:27 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:28.553 00:12:27 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:28.553 00:12:27 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:28.553 00:12:27 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:28.553 00:12:27 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:28.553 00:12:27 -- setup/devices.sh@50 -- # local mount_point= 00:04:28.553 00:12:27 -- setup/devices.sh@51 -- # local test_file= 00:04:28.553 00:12:27 -- setup/devices.sh@53 -- # local found=0 00:04:28.553 00:12:27 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:28.553 00:12:27 -- setup/devices.sh@59 -- # local pci status 00:04:28.553 00:12:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.553 00:12:27 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:28.553 00:12:27 -- setup/devices.sh@47 -- # setup output config 00:04:28.553 00:12:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:28.553 00:12:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:31.944 00:12:30 -- setup/devices.sh@63 -- # found=1 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- 
# read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.944 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.944 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.945 00:12:30 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:31.945 00:12:30 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:31.945 00:12:30 -- setup/devices.sh@68 -- # return 0 00:04:31.945 00:12:30 -- setup/devices.sh@187 -- # cleanup_dm 00:04:31.945 00:12:30 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:31.945 00:12:30 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:31.945 00:12:30 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:31.945 00:12:30 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:31.945 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:31.945 00:12:30 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:31.945 00:04:31.945 real 0m9.958s 00:04:31.945 user 0m2.518s 00:04:31.945 sys 0m4.549s 00:04:31.945 00:12:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.945 00:12:30 -- common/autotest_common.sh@10 -- # set +x 00:04:31.945 ************************************ 00:04:31.945 END TEST dm_mount 00:04:31.945 ************************************ 00:04:31.945 00:12:30 -- setup/devices.sh@1 -- # cleanup 00:04:31.945 00:12:30 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:31.945 00:12:30 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.945 00:12:30 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:31.945 00:12:30 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:31.945 00:12:30 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:32.204 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:32.204 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:32.204 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:32.204 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:32.204 00:12:31 -- setup/devices.sh@12 -- # cleanup_dm 00:04:32.204 00:12:31 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:32.204 00:12:31 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:32.204 00:12:31 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:32.204 00:12:31 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:32.204 00:12:31 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:32.204 00:12:31 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:32.204 00:04:32.204 real 0m26.116s 00:04:32.204 user 0m7.215s 00:04:32.204 sys 0m13.622s 00:04:32.204 00:12:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:32.204 00:12:31 -- common/autotest_common.sh@10 -- # set +x 00:04:32.204 ************************************ 00:04:32.204 END TEST devices 00:04:32.204 ************************************ 00:04:32.204 00:04:32.204 real 1m31.287s 00:04:32.204 user 0m27.920s 00:04:32.204 sys 0m51.943s 00:04:32.204 00:12:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:32.204 00:12:31 -- common/autotest_common.sh@10 -- # set +x 00:04:32.204 ************************************ 00:04:32.204 END TEST setup.sh 00:04:32.204 ************************************ 00:04:32.204 00:12:31 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:35.494 Hugepages 00:04:35.494 node hugesize free / total 00:04:35.494 node0 1048576kB 0 / 0 00:04:35.494 node0 2048kB 2048 / 2048 00:04:35.494 node1 1048576kB 0 / 0 00:04:35.494 node1 2048kB 0 / 0 00:04:35.494 00:04:35.494 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:35.494 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:35.494 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:35.494 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:35.494 00:12:34 -- spdk/autotest.sh@141 -- # uname -s 00:04:35.494 00:12:34 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:04:35.494 00:12:34 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:04:35.494 00:12:34 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:38.783 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:04:38.783 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:38.783 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:38.784 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:38.784 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:38.784 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:38.784 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:38.784 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:38.784 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:40.691 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:40.691 00:12:39 -- common/autotest_common.sh@1517 -- # sleep 1 00:04:41.630 00:12:40 -- common/autotest_common.sh@1518 -- # bdfs=() 00:04:41.630 00:12:40 -- common/autotest_common.sh@1518 -- # local bdfs 00:04:41.630 00:12:40 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:04:41.630 00:12:40 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:04:41.630 00:12:40 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:41.630 00:12:40 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:41.630 00:12:40 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:41.630 00:12:40 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:41.630 00:12:40 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:41.630 00:12:40 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:41.630 00:12:40 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:41.630 00:12:40 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:44.921 Waiting for block devices as requested 00:04:44.921 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:44.921 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:44.921 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:44.921 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:45.180 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:45.180 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:45.180 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:45.439 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:45.439 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:45.439 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:45.439 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:45.698 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:45.698 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:45.698 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:45.957 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:45.957 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:46.216 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:46.216 00:12:45 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:04:46.217 00:12:45 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:04:46.217 00:12:45 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:46.217 00:12:45 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:04:46.217 00:12:45 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1530 -- # grep oacs 00:04:46.217 00:12:45 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:04:46.217 00:12:45 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:04:46.217 00:12:45 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:04:46.217 00:12:45 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:04:46.217 00:12:45 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:04:46.217 00:12:45 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:04:46.217 00:12:45 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:04:46.217 00:12:45 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:04:46.217 00:12:45 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:04:46.217 00:12:45 -- common/autotest_common.sh@1542 -- # continue 00:04:46.217 00:12:45 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:04:46.217 00:12:45 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:46.217 00:12:45 -- common/autotest_common.sh@10 -- # set +x 00:04:46.217 00:12:45 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:04:46.217 00:12:45 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:46.217 00:12:45 -- common/autotest_common.sh@10 -- # set +x 00:04:46.217 00:12:45 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:49.502 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:49.503 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:49.760 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:49.760 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:49.760 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:49.760 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:49.760 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:51.663 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:51.663 00:12:50 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:04:51.663 00:12:50 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:51.664 00:12:50 -- common/autotest_common.sh@10 -- # set +x 00:04:51.664 00:12:50 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:04:51.664 00:12:50 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:04:51.664 00:12:50 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:04:51.664 00:12:50 -- common/autotest_common.sh@1562 -- # bdfs=() 00:04:51.664 00:12:50 -- common/autotest_common.sh@1562 -- # local bdfs 00:04:51.664 00:12:50 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:04:51.664 00:12:50 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:51.664 00:12:50 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:51.664 00:12:50 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:51.664 00:12:50 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:51.664 00:12:50 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:51.664 00:12:50 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:51.664 00:12:50 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:51.664 00:12:50 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:04:51.664 00:12:50 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:51.664 00:12:50 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:04:51.664 00:12:50 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:51.664 00:12:50 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:04:51.664 00:12:50 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:04:51.664 00:12:50 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:04:51.664 00:12:50 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=300790 00:04:51.664 00:12:50 -- common/autotest_common.sh@1583 -- # waitforlisten 300790 00:04:51.664 00:12:50 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:51.664 00:12:50 -- common/autotest_common.sh@819 -- # '[' -z 300790 ']' 00:04:51.664 00:12:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:51.664 00:12:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:51.664 00:12:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:51.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:51.664 00:12:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:51.664 00:12:50 -- common/autotest_common.sh@10 -- # set +x 00:04:51.664 [2024-07-15 00:12:50.490890] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
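For reference, the device-id filter the harness applied just above (reading /sys/bus/pci/devices/$bdf/device and matching 0x0a54) can be reproduced by hand. A minimal sketch, assuming the standard Linux sysfs layout; the loop below is illustrative, not the harness's exact helper:

  for bdf in /sys/bus/pci/devices/0000:*; do
    # keep only controllers whose PCI device id matches 0x0a54
    if [ "$(cat "$bdf/device" 2>/dev/null)" = "0x0a54" ]; then
      basename "$bdf"
    fi
  done

On this machine it would print only 0000:d8:00.0, the single NVMe controller the rest of the run operates on.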
00:04:51.664 [2024-07-15 00:12:50.490949] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid300790 ] 00:04:51.664 EAL: No free 2048 kB hugepages reported on node 1 00:04:51.664 [2024-07-15 00:12:50.558867] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:51.664 [2024-07-15 00:12:50.636034] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:51.664 [2024-07-15 00:12:50.636144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:52.599 00:12:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:52.599 00:12:51 -- common/autotest_common.sh@852 -- # return 0 00:04:52.599 00:12:51 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:04:52.599 00:12:51 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:04:52.599 00:12:51 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:04:55.887 nvme0n1 00:04:55.887 00:12:54 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:55.887 [2024-07-15 00:12:54.432426] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:55.887 request: 00:04:55.887 { 00:04:55.887 "nvme_ctrlr_name": "nvme0", 00:04:55.887 "password": "test", 00:04:55.887 "method": "bdev_nvme_opal_revert", 00:04:55.887 "req_id": 1 00:04:55.887 } 00:04:55.887 Got JSON-RPC error response 00:04:55.887 response: 00:04:55.887 { 00:04:55.887 "code": -32602, 00:04:55.887 "message": "Invalid parameters" 00:04:55.887 } 00:04:55.887 00:12:54 -- common/autotest_common.sh@1589 -- # true 00:04:55.887 00:12:54 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:04:55.887 00:12:54 -- common/autotest_common.sh@1593 -- # killprocess 300790 00:04:55.887 00:12:54 -- common/autotest_common.sh@926 -- # '[' -z 300790 ']' 00:04:55.887 00:12:54 -- common/autotest_common.sh@930 -- # kill -0 300790 00:04:55.887 00:12:54 -- common/autotest_common.sh@931 -- # uname 00:04:55.887 00:12:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:55.887 00:12:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 300790 00:04:55.887 00:12:54 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:04:55.887 00:12:54 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:04:55.887 00:12:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 300790' 00:04:55.887 killing process with pid 300790 00:04:55.887 00:12:54 -- common/autotest_common.sh@945 -- # kill 300790 00:04:55.887 00:12:54 -- common/autotest_common.sh@950 -- # wait 300790 00:04:57.794 00:12:56 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:04:57.794 00:12:56 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:04:57.794 00:12:56 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:04:57.794 00:12:56 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:04:57.794 00:12:56 -- spdk/autotest.sh@173 -- # timing_enter lib 00:04:57.794 00:12:56 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:57.794 00:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:57.794 00:12:56 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:57.794 00:12:56 -- 
common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:57.794 00:12:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:57.794 00:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:57.794 ************************************ 00:04:57.794 START TEST env 00:04:57.794 ************************************ 00:04:57.794 00:12:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:57.794 * Looking for test storage... 00:04:57.794 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:04:57.794 00:12:56 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:57.794 00:12:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:57.794 00:12:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:57.794 00:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:57.794 ************************************ 00:04:57.794 START TEST env_memory 00:04:57.794 ************************************ 00:04:57.794 00:12:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:57.794 00:04:57.794 00:04:57.794 CUnit - A unit testing framework for C - Version 2.1-3 00:04:57.794 http://cunit.sourceforge.net/ 00:04:57.794 00:04:57.794 00:04:57.794 Suite: memory 00:04:57.794 Test: alloc and free memory map ...[2024-07-15 00:12:56.755610] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:57.794 passed 00:04:57.794 Test: mem map translation ...[2024-07-15 00:12:56.769491] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:57.794 [2024-07-15 00:12:56.769507] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:57.794 [2024-07-15 00:12:56.769539] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:57.794 [2024-07-15 00:12:56.769548] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:57.794 passed 00:04:57.794 Test: mem map registration ...[2024-07-15 00:12:56.791455] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:57.794 [2024-07-15 00:12:56.791475] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:57.794 passed 00:04:57.794 Test: mem map adjacent registrations ...passed 00:04:57.794 00:04:57.794 Run Summary: Type Total Ran Passed Failed Inactive 00:04:57.794 suites 1 1 n/a 0 0 00:04:57.794 tests 4 4 4 0 0 00:04:57.794 asserts 152 152 152 0 n/a 00:04:57.794 00:04:57.794 Elapsed time = 0.091 seconds 00:04:57.794 00:04:57.794 real 0m0.105s 00:04:57.794 user 0m0.090s 00:04:57.794 sys 0m0.014s 00:04:57.794 00:12:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:57.794 00:12:56 -- common/autotest_common.sh@10 
-- # set +x 00:04:57.794 ************************************ 00:04:57.794 END TEST env_memory 00:04:57.794 ************************************ 00:04:58.054 00:12:56 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:58.054 00:12:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:58.054 00:12:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:58.054 00:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:58.054 ************************************ 00:04:58.054 START TEST env_vtophys 00:04:58.054 ************************************ 00:04:58.054 00:12:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:58.054 EAL: lib.eal log level changed from notice to debug 00:04:58.054 EAL: Detected lcore 0 as core 0 on socket 0 00:04:58.054 EAL: Detected lcore 1 as core 1 on socket 0 00:04:58.054 EAL: Detected lcore 2 as core 2 on socket 0 00:04:58.054 EAL: Detected lcore 3 as core 3 on socket 0 00:04:58.054 EAL: Detected lcore 4 as core 4 on socket 0 00:04:58.054 EAL: Detected lcore 5 as core 5 on socket 0 00:04:58.054 EAL: Detected lcore 6 as core 6 on socket 0 00:04:58.054 EAL: Detected lcore 7 as core 8 on socket 0 00:04:58.054 EAL: Detected lcore 8 as core 9 on socket 0 00:04:58.054 EAL: Detected lcore 9 as core 10 on socket 0 00:04:58.054 EAL: Detected lcore 10 as core 11 on socket 0 00:04:58.054 EAL: Detected lcore 11 as core 12 on socket 0 00:04:58.054 EAL: Detected lcore 12 as core 13 on socket 0 00:04:58.054 EAL: Detected lcore 13 as core 14 on socket 0 00:04:58.054 EAL: Detected lcore 14 as core 16 on socket 0 00:04:58.054 EAL: Detected lcore 15 as core 17 on socket 0 00:04:58.054 EAL: Detected lcore 16 as core 18 on socket 0 00:04:58.054 EAL: Detected lcore 17 as core 19 on socket 0 00:04:58.054 EAL: Detected lcore 18 as core 20 on socket 0 00:04:58.054 EAL: Detected lcore 19 as core 21 on socket 0 00:04:58.054 EAL: Detected lcore 20 as core 22 on socket 0 00:04:58.054 EAL: Detected lcore 21 as core 24 on socket 0 00:04:58.054 EAL: Detected lcore 22 as core 25 on socket 0 00:04:58.054 EAL: Detected lcore 23 as core 26 on socket 0 00:04:58.054 EAL: Detected lcore 24 as core 27 on socket 0 00:04:58.054 EAL: Detected lcore 25 as core 28 on socket 0 00:04:58.054 EAL: Detected lcore 26 as core 29 on socket 0 00:04:58.054 EAL: Detected lcore 27 as core 30 on socket 0 00:04:58.054 EAL: Detected lcore 28 as core 0 on socket 1 00:04:58.054 EAL: Detected lcore 29 as core 1 on socket 1 00:04:58.054 EAL: Detected lcore 30 as core 2 on socket 1 00:04:58.054 EAL: Detected lcore 31 as core 3 on socket 1 00:04:58.054 EAL: Detected lcore 32 as core 4 on socket 1 00:04:58.054 EAL: Detected lcore 33 as core 5 on socket 1 00:04:58.054 EAL: Detected lcore 34 as core 6 on socket 1 00:04:58.054 EAL: Detected lcore 35 as core 8 on socket 1 00:04:58.054 EAL: Detected lcore 36 as core 9 on socket 1 00:04:58.054 EAL: Detected lcore 37 as core 10 on socket 1 00:04:58.054 EAL: Detected lcore 38 as core 11 on socket 1 00:04:58.054 EAL: Detected lcore 39 as core 12 on socket 1 00:04:58.054 EAL: Detected lcore 40 as core 13 on socket 1 00:04:58.054 EAL: Detected lcore 41 as core 14 on socket 1 00:04:58.054 EAL: Detected lcore 42 as core 16 on socket 1 00:04:58.054 EAL: Detected lcore 43 as core 17 on socket 1 00:04:58.054 EAL: Detected lcore 44 as core 18 on socket 1 00:04:58.055 EAL: Detected lcore 45 as core 19 on socket 1 00:04:58.055 EAL: 
Detected lcore 46 as core 20 on socket 1 00:04:58.055 EAL: Detected lcore 47 as core 21 on socket 1 00:04:58.055 EAL: Detected lcore 48 as core 22 on socket 1 00:04:58.055 EAL: Detected lcore 49 as core 24 on socket 1 00:04:58.055 EAL: Detected lcore 50 as core 25 on socket 1 00:04:58.055 EAL: Detected lcore 51 as core 26 on socket 1 00:04:58.055 EAL: Detected lcore 52 as core 27 on socket 1 00:04:58.055 EAL: Detected lcore 53 as core 28 on socket 1 00:04:58.055 EAL: Detected lcore 54 as core 29 on socket 1 00:04:58.055 EAL: Detected lcore 55 as core 30 on socket 1 00:04:58.055 EAL: Detected lcore 56 as core 0 on socket 0 00:04:58.055 EAL: Detected lcore 57 as core 1 on socket 0 00:04:58.055 EAL: Detected lcore 58 as core 2 on socket 0 00:04:58.055 EAL: Detected lcore 59 as core 3 on socket 0 00:04:58.055 EAL: Detected lcore 60 as core 4 on socket 0 00:04:58.055 EAL: Detected lcore 61 as core 5 on socket 0 00:04:58.055 EAL: Detected lcore 62 as core 6 on socket 0 00:04:58.055 EAL: Detected lcore 63 as core 8 on socket 0 00:04:58.055 EAL: Detected lcore 64 as core 9 on socket 0 00:04:58.055 EAL: Detected lcore 65 as core 10 on socket 0 00:04:58.055 EAL: Detected lcore 66 as core 11 on socket 0 00:04:58.055 EAL: Detected lcore 67 as core 12 on socket 0 00:04:58.055 EAL: Detected lcore 68 as core 13 on socket 0 00:04:58.055 EAL: Detected lcore 69 as core 14 on socket 0 00:04:58.055 EAL: Detected lcore 70 as core 16 on socket 0 00:04:58.055 EAL: Detected lcore 71 as core 17 on socket 0 00:04:58.055 EAL: Detected lcore 72 as core 18 on socket 0 00:04:58.055 EAL: Detected lcore 73 as core 19 on socket 0 00:04:58.055 EAL: Detected lcore 74 as core 20 on socket 0 00:04:58.055 EAL: Detected lcore 75 as core 21 on socket 0 00:04:58.055 EAL: Detected lcore 76 as core 22 on socket 0 00:04:58.055 EAL: Detected lcore 77 as core 24 on socket 0 00:04:58.055 EAL: Detected lcore 78 as core 25 on socket 0 00:04:58.055 EAL: Detected lcore 79 as core 26 on socket 0 00:04:58.055 EAL: Detected lcore 80 as core 27 on socket 0 00:04:58.055 EAL: Detected lcore 81 as core 28 on socket 0 00:04:58.055 EAL: Detected lcore 82 as core 29 on socket 0 00:04:58.055 EAL: Detected lcore 83 as core 30 on socket 0 00:04:58.055 EAL: Detected lcore 84 as core 0 on socket 1 00:04:58.055 EAL: Detected lcore 85 as core 1 on socket 1 00:04:58.055 EAL: Detected lcore 86 as core 2 on socket 1 00:04:58.055 EAL: Detected lcore 87 as core 3 on socket 1 00:04:58.055 EAL: Detected lcore 88 as core 4 on socket 1 00:04:58.055 EAL: Detected lcore 89 as core 5 on socket 1 00:04:58.055 EAL: Detected lcore 90 as core 6 on socket 1 00:04:58.055 EAL: Detected lcore 91 as core 8 on socket 1 00:04:58.055 EAL: Detected lcore 92 as core 9 on socket 1 00:04:58.055 EAL: Detected lcore 93 as core 10 on socket 1 00:04:58.055 EAL: Detected lcore 94 as core 11 on socket 1 00:04:58.055 EAL: Detected lcore 95 as core 12 on socket 1 00:04:58.055 EAL: Detected lcore 96 as core 13 on socket 1 00:04:58.055 EAL: Detected lcore 97 as core 14 on socket 1 00:04:58.055 EAL: Detected lcore 98 as core 16 on socket 1 00:04:58.055 EAL: Detected lcore 99 as core 17 on socket 1 00:04:58.055 EAL: Detected lcore 100 as core 18 on socket 1 00:04:58.055 EAL: Detected lcore 101 as core 19 on socket 1 00:04:58.055 EAL: Detected lcore 102 as core 20 on socket 1 00:04:58.055 EAL: Detected lcore 103 as core 21 on socket 1 00:04:58.055 EAL: Detected lcore 104 as core 22 on socket 1 00:04:58.055 EAL: Detected lcore 105 as core 24 on socket 1 00:04:58.055 EAL: Detected lcore 106 as core 
25 on socket 1 00:04:58.055 EAL: Detected lcore 107 as core 26 on socket 1 00:04:58.055 EAL: Detected lcore 108 as core 27 on socket 1 00:04:58.055 EAL: Detected lcore 109 as core 28 on socket 1 00:04:58.055 EAL: Detected lcore 110 as core 29 on socket 1 00:04:58.055 EAL: Detected lcore 111 as core 30 on socket 1 00:04:58.055 EAL: Maximum logical cores by configuration: 128 00:04:58.055 EAL: Detected CPU lcores: 112 00:04:58.055 EAL: Detected NUMA nodes: 2 00:04:58.055 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:58.055 EAL: Checking presence of .so 'librte_eal.so.24' 00:04:58.055 EAL: Checking presence of .so 'librte_eal.so' 00:04:58.055 EAL: Detected static linkage of DPDK 00:04:58.055 EAL: No shared files mode enabled, IPC will be disabled 00:04:58.055 EAL: Bus pci wants IOVA as 'DC' 00:04:58.055 EAL: Buses did not request a specific IOVA mode. 00:04:58.055 EAL: IOMMU is available, selecting IOVA as VA mode. 00:04:58.055 EAL: Selected IOVA mode 'VA' 00:04:58.055 EAL: No free 2048 kB hugepages reported on node 1 00:04:58.055 EAL: Probing VFIO support... 00:04:58.055 EAL: IOMMU type 1 (Type 1) is supported 00:04:58.055 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:58.055 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:58.055 EAL: VFIO support initialized 00:04:58.055 EAL: Ask a virtual area of 0x2e000 bytes 00:04:58.055 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:58.055 EAL: Setting up physically contiguous memory... 00:04:58.055 EAL: Setting maximum number of open files to 524288 00:04:58.055 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:58.055 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:58.055 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:58.055 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:58.055 EAL: Ask a virtual area of 
0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:58.055 EAL: Ask a virtual area of 0x61000 bytes 00:04:58.055 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:58.055 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:58.055 EAL: Ask a virtual area of 0x400000000 bytes 00:04:58.055 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:58.055 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:58.055 EAL: Hugepages will be freed exactly as allocated. 00:04:58.055 EAL: No shared files mode enabled, IPC is disabled 00:04:58.055 EAL: No shared files mode enabled, IPC is disabled 00:04:58.055 EAL: TSC frequency is ~2500000 KHz 00:04:58.055 EAL: Main lcore 0 is ready (tid=7f3bbbfb3a00;cpuset=[0]) 00:04:58.055 EAL: Trying to obtain current memory policy. 00:04:58.055 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 0 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 2MB 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Mem event callback 'spdk:(nil)' registered 00:04:58.056 00:04:58.056 00:04:58.056 CUnit - A unit testing framework for C - Version 2.1-3 00:04:58.056 http://cunit.sourceforge.net/ 00:04:58.056 00:04:58.056 00:04:58.056 Suite: components_suite 00:04:58.056 Test: vtophys_malloc_test ...passed 00:04:58.056 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 4MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 4MB 00:04:58.056 EAL: Trying to obtain current memory policy. 
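These expand/shrink pairs are DPDK's dynamic memory mode at work: each allocation in the vtophys test grows the heap by whole 2MB hugepages and each free returns them, firing the registered 'spdk:' mem event callback both ways. A small, hypothetical way to watch the same effect from outside the test, using standard sysfs paths (node numbering is machine-specific):

  for n in /sys/devices/system/node/node*/hugepages/hugepages-2048kB; do
    echo "$n: $(cat "$n/free_hugepages") free / $(cat "$n/nr_hugepages") total"
  done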
00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 6MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 6MB 00:04:58.056 EAL: Trying to obtain current memory policy. 00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 10MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 10MB 00:04:58.056 EAL: Trying to obtain current memory policy. 00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 18MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 18MB 00:04:58.056 EAL: Trying to obtain current memory policy. 00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 34MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 34MB 00:04:58.056 EAL: Trying to obtain current memory policy. 00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 66MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 66MB 00:04:58.056 EAL: Trying to obtain current memory policy. 
00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 130MB 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was shrunk by 130MB 00:04:58.056 EAL: Trying to obtain current memory policy. 00:04:58.056 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.056 EAL: Restoring previous memory policy: 4 00:04:58.056 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.056 EAL: request: mp_malloc_sync 00:04:58.056 EAL: No shared files mode enabled, IPC is disabled 00:04:58.056 EAL: Heap on socket 0 was expanded by 258MB 00:04:58.315 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.315 EAL: request: mp_malloc_sync 00:04:58.315 EAL: No shared files mode enabled, IPC is disabled 00:04:58.315 EAL: Heap on socket 0 was shrunk by 258MB 00:04:58.315 EAL: Trying to obtain current memory policy. 00:04:58.315 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.315 EAL: Restoring previous memory policy: 4 00:04:58.315 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.315 EAL: request: mp_malloc_sync 00:04:58.315 EAL: No shared files mode enabled, IPC is disabled 00:04:58.315 EAL: Heap on socket 0 was expanded by 514MB 00:04:58.315 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.574 EAL: request: mp_malloc_sync 00:04:58.574 EAL: No shared files mode enabled, IPC is disabled 00:04:58.574 EAL: Heap on socket 0 was shrunk by 514MB 00:04:58.574 EAL: Trying to obtain current memory policy. 
00:04:58.574 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.574 EAL: Restoring previous memory policy: 4 00:04:58.574 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.575 EAL: request: mp_malloc_sync 00:04:58.575 EAL: No shared files mode enabled, IPC is disabled 00:04:58.575 EAL: Heap on socket 0 was expanded by 1026MB 00:04:58.833 EAL: Calling mem event callback 'spdk:(nil)' 00:04:59.093 EAL: request: mp_malloc_sync 00:04:59.093 EAL: No shared files mode enabled, IPC is disabled 00:04:59.093 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:59.093 passed 00:04:59.093 00:04:59.093 Run Summary: Type Total Ran Passed Failed Inactive 00:04:59.093 suites 1 1 n/a 0 0 00:04:59.093 tests 2 2 2 0 0 00:04:59.093 asserts 497 497 497 0 n/a 00:04:59.093 00:04:59.093 Elapsed time = 0.954 seconds 00:04:59.093 EAL: Calling mem event callback 'spdk:(nil)' 00:04:59.093 EAL: request: mp_malloc_sync 00:04:59.093 EAL: No shared files mode enabled, IPC is disabled 00:04:59.093 EAL: Heap on socket 0 was shrunk by 2MB 00:04:59.093 EAL: No shared files mode enabled, IPC is disabled 00:04:59.093 EAL: No shared files mode enabled, IPC is disabled 00:04:59.093 EAL: No shared files mode enabled, IPC is disabled 00:04:59.093 00:04:59.093 real 0m1.069s 00:04:59.093 user 0m0.624s 00:04:59.093 sys 0m0.421s 00:04:59.093 00:12:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.093 00:12:57 -- common/autotest_common.sh@10 -- # set +x 00:04:59.093 ************************************ 00:04:59.093 END TEST env_vtophys 00:04:59.093 ************************************ 00:04:59.093 00:12:57 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:59.093 00:12:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:59.093 00:12:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:59.093 00:12:57 -- common/autotest_common.sh@10 -- # set +x 00:04:59.093 ************************************ 00:04:59.093 START TEST env_pci 00:04:59.093 ************************************ 00:04:59.093 00:12:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:59.093 00:04:59.093 00:04:59.093 CUnit - A unit testing framework for C - Version 2.1-3 00:04:59.093 http://cunit.sourceforge.net/ 00:04:59.093 00:04:59.093 00:04:59.093 Suite: pci 00:04:59.093 Test: pci_hook ...[2024-07-15 00:12:57.987328] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 302116 has claimed it 00:04:59.093 EAL: Cannot find device (10000:00:01.0) 00:04:59.093 EAL: Failed to attach device on primary process 00:04:59.093 passed 00:04:59.093 00:04:59.093 Run Summary: Type Total Ran Passed Failed Inactive 00:04:59.093 suites 1 1 n/a 0 0 00:04:59.093 tests 1 1 1 0 0 00:04:59.093 asserts 25 25 25 0 n/a 00:04:59.093 00:04:59.093 Elapsed time = 0.024 seconds 00:04:59.093 00:04:59.093 real 0m0.033s 00:04:59.093 user 0m0.009s 00:04:59.093 sys 0m0.024s 00:04:59.093 00:12:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.093 00:12:58 -- common/autotest_common.sh@10 -- # set +x 00:04:59.093 ************************************ 00:04:59.093 END TEST env_pci 00:04:59.093 ************************************ 00:04:59.093 00:12:58 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:59.093 00:12:58 -- env/env.sh@15 -- # uname 00:04:59.093 00:12:58 -- env/env.sh@15 -- # '[' 
Linux = Linux ']' 00:04:59.093 00:12:58 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:59.093 00:12:58 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:59.093 00:12:58 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:04:59.093 00:12:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:59.093 00:12:58 -- common/autotest_common.sh@10 -- # set +x 00:04:59.093 ************************************ 00:04:59.093 START TEST env_dpdk_post_init 00:04:59.093 ************************************ 00:04:59.093 00:12:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:59.093 EAL: Detected CPU lcores: 112 00:04:59.093 EAL: Detected NUMA nodes: 2 00:04:59.093 EAL: Detected static linkage of DPDK 00:04:59.093 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:59.093 EAL: Selected IOVA mode 'VA' 00:04:59.093 EAL: No free 2048 kB hugepages reported on node 1 00:04:59.093 EAL: VFIO support initialized 00:04:59.093 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:59.352 EAL: Using IOMMU type 1 (Type 1) 00:04:59.921 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:04.114 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:04.114 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:04.114 Starting DPDK initialization... 00:05:04.114 Starting SPDK post initialization... 00:05:04.114 SPDK NVMe probe 00:05:04.114 Attaching to 0000:d8:00.0 00:05:04.114 Attached to 0000:d8:00.0 00:05:04.114 Cleaning up... 
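Before a post-init run like the one above can attach, the controller must be bound to vfio-pci, which is what setup.sh did earlier in the log. A quick manual sanity check, using this run's bdf as an example:

  readlink /sys/bus/pci/devices/0000:d8:00.0/driver
  # expect a path ending in .../drivers/vfio-pci;
  # after 'setup.sh reset' it points back at .../drivers/nvme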
00:05:04.114 00:05:04.114 real 0m4.714s 00:05:04.114 user 0m3.573s 00:05:04.114 sys 0m0.389s 00:05:04.114 00:13:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.114 00:13:02 -- common/autotest_common.sh@10 -- # set +x 00:05:04.114 ************************************ 00:05:04.114 END TEST env_dpdk_post_init 00:05:04.114 ************************************ 00:05:04.114 00:13:02 -- env/env.sh@26 -- # uname 00:05:04.114 00:13:02 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:04.114 00:13:02 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:04.114 00:13:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:04.114 00:13:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:04.114 00:13:02 -- common/autotest_common.sh@10 -- # set +x 00:05:04.114 ************************************ 00:05:04.114 START TEST env_mem_callbacks 00:05:04.114 ************************************ 00:05:04.114 00:13:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:04.114 EAL: Detected CPU lcores: 112 00:05:04.114 EAL: Detected NUMA nodes: 2 00:05:04.114 EAL: Detected static linkage of DPDK 00:05:04.114 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:04.114 EAL: Selected IOVA mode 'VA' 00:05:04.114 EAL: No free 2048 kB hugepages reported on node 1 00:05:04.114 EAL: VFIO support initialized 00:05:04.114 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:04.114 00:05:04.114 00:05:04.114 CUnit - A unit testing framework for C - Version 2.1-3 00:05:04.114 http://cunit.sourceforge.net/ 00:05:04.114 00:05:04.114 00:05:04.114 Suite: memory 00:05:04.114 Test: test ... 
00:05:04.114 register 0x200000200000 2097152 00:05:04.114 malloc 3145728 00:05:04.114 register 0x200000400000 4194304 00:05:04.114 buf 0x200000500000 len 3145728 PASSED 00:05:04.114 malloc 64 00:05:04.114 buf 0x2000004fff40 len 64 PASSED 00:05:04.114 malloc 4194304 00:05:04.114 register 0x200000800000 6291456 00:05:04.114 buf 0x200000a00000 len 4194304 PASSED 00:05:04.114 free 0x200000500000 3145728 00:05:04.114 free 0x2000004fff40 64 00:05:04.114 unregister 0x200000400000 4194304 PASSED 00:05:04.114 free 0x200000a00000 4194304 00:05:04.114 unregister 0x200000800000 6291456 PASSED 00:05:04.114 malloc 8388608 00:05:04.114 register 0x200000400000 10485760 00:05:04.114 buf 0x200000600000 len 8388608 PASSED 00:05:04.114 free 0x200000600000 8388608 00:05:04.114 unregister 0x200000400000 10485760 PASSED 00:05:04.114 passed 00:05:04.114 00:05:04.114 Run Summary: Type Total Ran Passed Failed Inactive 00:05:04.115 suites 1 1 n/a 0 0 00:05:04.115 tests 1 1 1 0 0 00:05:04.115 asserts 15 15 15 0 n/a 00:05:04.115 00:05:04.115 Elapsed time = 0.005 seconds 00:05:04.115 00:05:04.115 real 0m0.050s 00:05:04.115 user 0m0.020s 00:05:04.115 sys 0m0.030s 00:05:04.115 00:13:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.115 00:13:02 -- common/autotest_common.sh@10 -- # set +x 00:05:04.115 ************************************ 00:05:04.115 END TEST env_mem_callbacks 00:05:04.115 ************************************ 00:05:04.115 00:05:04.115 real 0m6.305s 00:05:04.115 user 0m4.442s 00:05:04.115 sys 0m1.132s 00:05:04.115 00:13:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.115 00:13:02 -- common/autotest_common.sh@10 -- # set +x 00:05:04.115 ************************************ 00:05:04.115 END TEST env 00:05:04.115 ************************************ 00:05:04.115 00:13:02 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:04.115 00:13:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:04.115 00:13:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:04.115 00:13:02 -- common/autotest_common.sh@10 -- # set +x 00:05:04.115 ************************************ 00:05:04.115 START TEST rpc 00:05:04.115 ************************************ 00:05:04.115 00:13:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:04.115 * Looking for test storage... 00:05:04.115 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:04.115 00:13:03 -- rpc/rpc.sh@65 -- # spdk_pid=303254 00:05:04.115 00:13:03 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:04.115 00:13:03 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:04.115 00:13:03 -- rpc/rpc.sh@67 -- # waitforlisten 303254 00:05:04.115 00:13:03 -- common/autotest_common.sh@819 -- # '[' -z 303254 ']' 00:05:04.115 00:13:03 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:04.115 00:13:03 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:04.115 00:13:03 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:04.115 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
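The wait above is the harness's waitforlisten helper polling the target's RPC socket until spdk_tgt is ready. A minimal stand-in with the same effect, assuming the default socket path (rpc_get_methods is a cheap, always-available RPC):

  sock=/var/tmp/spdk.sock
  for i in $(seq 1 100); do
    scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1 && break
    sleep 0.1
  done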
00:05:04.115 00:13:03 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:04.115 00:13:03 -- common/autotest_common.sh@10 -- # set +x 00:05:04.115 [2024-07-15 00:13:03.090538] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:04.115 [2024-07-15 00:13:03.090605] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid303254 ] 00:05:04.115 EAL: No free 2048 kB hugepages reported on node 1 00:05:04.115 [2024-07-15 00:13:03.158830] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:04.396 [2024-07-15 00:13:03.229129] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:04.396 [2024-07-15 00:13:03.229234] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:04.396 [2024-07-15 00:13:03.229243] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 303254' to capture a snapshot of events at runtime. 00:05:04.396 [2024-07-15 00:13:03.229252] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid303254 for offline analysis/debug. 00:05:04.396 [2024-07-15 00:13:03.229269] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:05.042 00:13:03 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:05.042 00:13:03 -- common/autotest_common.sh@852 -- # return 0 00:05:05.042 00:13:03 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:05.042 00:13:03 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:05.042 00:13:03 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:05.042 00:13:03 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:05.042 00:13:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:05.042 00:13:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:05.042 00:13:03 -- common/autotest_common.sh@10 -- # set +x 00:05:05.042 ************************************ 00:05:05.042 START TEST rpc_integrity 00:05:05.042 ************************************ 00:05:05.042 00:13:03 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:05.042 00:13:03 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:05.042 00:13:03 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.042 00:13:03 -- common/autotest_common.sh@10 -- # set +x 00:05:05.042 00:13:03 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.042 00:13:03 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:05.042 00:13:03 -- rpc/rpc.sh@13 -- # jq length 00:05:05.042 00:13:03 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:05.042 00:13:03 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:05.042 00:13:03 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.042 00:13:03 -- common/autotest_common.sh@10 -- # set +x 00:05:05.042 00:13:03 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.042 00:13:03 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:05.042 00:13:03 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:05.042 00:13:03 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.042 00:13:03 -- common/autotest_common.sh@10 -- # set +x 00:05:05.042 00:13:03 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.042 00:13:03 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:05.042 { 00:05:05.042 "name": "Malloc0", 00:05:05.042 "aliases": [ 00:05:05.042 "3f4497de-14c7-4dc5-b498-50573191c577" 00:05:05.042 ], 00:05:05.042 "product_name": "Malloc disk", 00:05:05.042 "block_size": 512, 00:05:05.042 "num_blocks": 16384, 00:05:05.042 "uuid": "3f4497de-14c7-4dc5-b498-50573191c577", 00:05:05.042 "assigned_rate_limits": { 00:05:05.042 "rw_ios_per_sec": 0, 00:05:05.042 "rw_mbytes_per_sec": 0, 00:05:05.042 "r_mbytes_per_sec": 0, 00:05:05.042 "w_mbytes_per_sec": 0 00:05:05.042 }, 00:05:05.042 "claimed": false, 00:05:05.042 "zoned": false, 00:05:05.042 "supported_io_types": { 00:05:05.042 "read": true, 00:05:05.042 "write": true, 00:05:05.042 "unmap": true, 00:05:05.042 "write_zeroes": true, 00:05:05.042 "flush": true, 00:05:05.042 "reset": true, 00:05:05.042 "compare": false, 00:05:05.042 "compare_and_write": false, 00:05:05.042 "abort": true, 00:05:05.042 "nvme_admin": false, 00:05:05.042 "nvme_io": false 00:05:05.042 }, 00:05:05.042 "memory_domains": [ 00:05:05.042 { 00:05:05.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.043 "dma_device_type": 2 00:05:05.043 } 00:05:05.043 ], 00:05:05.043 "driver_specific": {} 00:05:05.043 } 00:05:05.043 ]' 00:05:05.043 00:13:03 -- rpc/rpc.sh@17 -- # jq length 00:05:05.043 00:13:04 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:05.043 00:13:04 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:05.043 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.043 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.043 [2024-07-15 00:13:04.032999] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:05.043 [2024-07-15 00:13:04.033034] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:05.043 [2024-07-15 00:13:04.033051] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x548d5e0 00:05:05.043 [2024-07-15 00:13:04.033064] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:05.043 [2024-07-15 00:13:04.033914] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:05.043 [2024-07-15 00:13:04.033936] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:05.043 Passthru0 00:05:05.043 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.043 00:13:04 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:05.043 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.043 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.043 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.043 00:13:04 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:05.043 { 00:05:05.043 "name": "Malloc0", 00:05:05.043 "aliases": [ 00:05:05.043 "3f4497de-14c7-4dc5-b498-50573191c577" 00:05:05.043 ], 00:05:05.043 "product_name": "Malloc disk", 00:05:05.043 "block_size": 512, 00:05:05.043 "num_blocks": 16384, 00:05:05.043 "uuid": "3f4497de-14c7-4dc5-b498-50573191c577", 00:05:05.043 "assigned_rate_limits": { 00:05:05.043 "rw_ios_per_sec": 0, 00:05:05.043 
"rw_mbytes_per_sec": 0, 00:05:05.043 "r_mbytes_per_sec": 0, 00:05:05.043 "w_mbytes_per_sec": 0 00:05:05.043 }, 00:05:05.043 "claimed": true, 00:05:05.043 "claim_type": "exclusive_write", 00:05:05.043 "zoned": false, 00:05:05.043 "supported_io_types": { 00:05:05.043 "read": true, 00:05:05.043 "write": true, 00:05:05.043 "unmap": true, 00:05:05.043 "write_zeroes": true, 00:05:05.043 "flush": true, 00:05:05.043 "reset": true, 00:05:05.043 "compare": false, 00:05:05.043 "compare_and_write": false, 00:05:05.043 "abort": true, 00:05:05.043 "nvme_admin": false, 00:05:05.043 "nvme_io": false 00:05:05.043 }, 00:05:05.043 "memory_domains": [ 00:05:05.043 { 00:05:05.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.043 "dma_device_type": 2 00:05:05.043 } 00:05:05.043 ], 00:05:05.043 "driver_specific": {} 00:05:05.043 }, 00:05:05.043 { 00:05:05.043 "name": "Passthru0", 00:05:05.043 "aliases": [ 00:05:05.043 "15100b89-4254-5631-ade2-31a842c6f0f2" 00:05:05.043 ], 00:05:05.043 "product_name": "passthru", 00:05:05.043 "block_size": 512, 00:05:05.043 "num_blocks": 16384, 00:05:05.043 "uuid": "15100b89-4254-5631-ade2-31a842c6f0f2", 00:05:05.043 "assigned_rate_limits": { 00:05:05.043 "rw_ios_per_sec": 0, 00:05:05.043 "rw_mbytes_per_sec": 0, 00:05:05.043 "r_mbytes_per_sec": 0, 00:05:05.043 "w_mbytes_per_sec": 0 00:05:05.043 }, 00:05:05.043 "claimed": false, 00:05:05.043 "zoned": false, 00:05:05.043 "supported_io_types": { 00:05:05.043 "read": true, 00:05:05.043 "write": true, 00:05:05.043 "unmap": true, 00:05:05.043 "write_zeroes": true, 00:05:05.043 "flush": true, 00:05:05.043 "reset": true, 00:05:05.043 "compare": false, 00:05:05.043 "compare_and_write": false, 00:05:05.043 "abort": true, 00:05:05.043 "nvme_admin": false, 00:05:05.043 "nvme_io": false 00:05:05.043 }, 00:05:05.043 "memory_domains": [ 00:05:05.043 { 00:05:05.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.043 "dma_device_type": 2 00:05:05.043 } 00:05:05.043 ], 00:05:05.043 "driver_specific": { 00:05:05.043 "passthru": { 00:05:05.043 "name": "Passthru0", 00:05:05.043 "base_bdev_name": "Malloc0" 00:05:05.043 } 00:05:05.043 } 00:05:05.043 } 00:05:05.043 ]' 00:05:05.043 00:13:04 -- rpc/rpc.sh@21 -- # jq length 00:05:05.302 00:13:04 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:05.302 00:13:04 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:05.302 00:13:04 -- rpc/rpc.sh@26 -- # jq length 00:05:05.302 00:13:04 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:05.302 00:05:05.302 real 0m0.292s 00:05:05.302 user 0m0.175s 00:05:05.302 sys 0m0.048s 00:05:05.302 00:13:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 
************************************ 00:05:05.302 END TEST rpc_integrity 00:05:05.302 ************************************ 00:05:05.302 00:13:04 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:05.302 00:13:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:05.302 00:13:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 ************************************ 00:05:05.302 START TEST rpc_plugins 00:05:05.302 ************************************ 00:05:05.302 00:13:04 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:05.302 00:13:04 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:05.302 00:13:04 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:05.302 { 00:05:05.302 "name": "Malloc1", 00:05:05.302 "aliases": [ 00:05:05.302 "8c223b15-23f8-4e23-842e-34006b278d0e" 00:05:05.302 ], 00:05:05.302 "product_name": "Malloc disk", 00:05:05.302 "block_size": 4096, 00:05:05.302 "num_blocks": 256, 00:05:05.302 "uuid": "8c223b15-23f8-4e23-842e-34006b278d0e", 00:05:05.302 "assigned_rate_limits": { 00:05:05.302 "rw_ios_per_sec": 0, 00:05:05.302 "rw_mbytes_per_sec": 0, 00:05:05.302 "r_mbytes_per_sec": 0, 00:05:05.302 "w_mbytes_per_sec": 0 00:05:05.302 }, 00:05:05.302 "claimed": false, 00:05:05.302 "zoned": false, 00:05:05.302 "supported_io_types": { 00:05:05.302 "read": true, 00:05:05.302 "write": true, 00:05:05.302 "unmap": true, 00:05:05.302 "write_zeroes": true, 00:05:05.302 "flush": true, 00:05:05.302 "reset": true, 00:05:05.302 "compare": false, 00:05:05.302 "compare_and_write": false, 00:05:05.302 "abort": true, 00:05:05.302 "nvme_admin": false, 00:05:05.302 "nvme_io": false 00:05:05.302 }, 00:05:05.302 "memory_domains": [ 00:05:05.302 { 00:05:05.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.302 "dma_device_type": 2 00:05:05.302 } 00:05:05.302 ], 00:05:05.302 "driver_specific": {} 00:05:05.302 } 00:05:05.302 ]' 00:05:05.302 00:13:04 -- rpc/rpc.sh@32 -- # jq length 00:05:05.302 00:13:04 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:05.302 00:13:04 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:05.302 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.302 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.302 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.302 00:13:04 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:05.302 00:13:04 -- rpc/rpc.sh@36 -- # jq length 00:05:05.562 00:13:04 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:05.562 00:05:05.562 real 0m0.137s 00:05:05.562 user 0m0.091s 00:05:05.562 sys 0m0.010s 00:05:05.562 00:13:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:05:05.562 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.562 ************************************ 00:05:05.562 END TEST rpc_plugins 00:05:05.562 ************************************ 00:05:05.562 00:13:04 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:05.562 00:13:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:05.562 00:13:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:05.562 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.562 ************************************ 00:05:05.562 START TEST rpc_trace_cmd_test 00:05:05.562 ************************************ 00:05:05.562 00:13:04 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:05.562 00:13:04 -- rpc/rpc.sh@40 -- # local info 00:05:05.562 00:13:04 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:05.562 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.562 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.562 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.562 00:13:04 -- rpc/rpc.sh@42 -- # info='{ 00:05:05.562 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid303254", 00:05:05.562 "tpoint_group_mask": "0x8", 00:05:05.562 "iscsi_conn": { 00:05:05.562 "mask": "0x2", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "scsi": { 00:05:05.562 "mask": "0x4", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "bdev": { 00:05:05.562 "mask": "0x8", 00:05:05.562 "tpoint_mask": "0xffffffffffffffff" 00:05:05.562 }, 00:05:05.562 "nvmf_rdma": { 00:05:05.562 "mask": "0x10", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "nvmf_tcp": { 00:05:05.562 "mask": "0x20", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "ftl": { 00:05:05.562 "mask": "0x40", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "blobfs": { 00:05:05.562 "mask": "0x80", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "dsa": { 00:05:05.562 "mask": "0x200", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "thread": { 00:05:05.562 "mask": "0x400", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "nvme_pcie": { 00:05:05.562 "mask": "0x800", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "iaa": { 00:05:05.562 "mask": "0x1000", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "nvme_tcp": { 00:05:05.562 "mask": "0x2000", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 }, 00:05:05.562 "bdev_nvme": { 00:05:05.562 "mask": "0x4000", 00:05:05.562 "tpoint_mask": "0x0" 00:05:05.562 } 00:05:05.562 }' 00:05:05.562 00:13:04 -- rpc/rpc.sh@43 -- # jq length 00:05:05.562 00:13:04 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:05.562 00:13:04 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:05.562 00:13:04 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:05.562 00:13:04 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:05.562 00:13:04 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:05.562 00:13:04 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:05.562 00:13:04 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:05.562 00:13:04 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:05.822 00:13:04 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:05.822 00:05:05.822 real 0m0.216s 00:05:05.822 user 0m0.171s 00:05:05.822 sys 0m0.036s 00:05:05.822 00:13:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 
************************************ 00:05:05.822 END TEST rpc_trace_cmd_test 00:05:05.822 ************************************ 00:05:05.822 00:13:04 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:05.822 00:13:04 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:05.822 00:13:04 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:05.822 00:13:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:05.822 00:13:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 ************************************ 00:05:05.822 START TEST rpc_daemon_integrity 00:05:05.822 ************************************ 00:05:05.822 00:13:04 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:05.822 00:13:04 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:05.822 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.822 00:13:04 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:05.822 00:13:04 -- rpc/rpc.sh@13 -- # jq length 00:05:05.822 00:13:04 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:05.822 00:13:04 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:05.822 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.822 00:13:04 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:05.822 00:13:04 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:05.822 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.822 00:13:04 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:05.822 { 00:05:05.822 "name": "Malloc2", 00:05:05.822 "aliases": [ 00:05:05.822 "bcc6b68e-502e-4b77-af7d-3157100321a2" 00:05:05.822 ], 00:05:05.822 "product_name": "Malloc disk", 00:05:05.822 "block_size": 512, 00:05:05.822 "num_blocks": 16384, 00:05:05.822 "uuid": "bcc6b68e-502e-4b77-af7d-3157100321a2", 00:05:05.822 "assigned_rate_limits": { 00:05:05.822 "rw_ios_per_sec": 0, 00:05:05.822 "rw_mbytes_per_sec": 0, 00:05:05.822 "r_mbytes_per_sec": 0, 00:05:05.822 "w_mbytes_per_sec": 0 00:05:05.822 }, 00:05:05.822 "claimed": false, 00:05:05.822 "zoned": false, 00:05:05.822 "supported_io_types": { 00:05:05.822 "read": true, 00:05:05.822 "write": true, 00:05:05.822 "unmap": true, 00:05:05.822 "write_zeroes": true, 00:05:05.822 "flush": true, 00:05:05.822 "reset": true, 00:05:05.822 "compare": false, 00:05:05.822 "compare_and_write": false, 00:05:05.822 "abort": true, 00:05:05.822 "nvme_admin": false, 00:05:05.822 "nvme_io": false 00:05:05.822 }, 00:05:05.822 "memory_domains": [ 00:05:05.822 { 00:05:05.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.822 "dma_device_type": 2 00:05:05.822 } 00:05:05.822 ], 00:05:05.822 "driver_specific": {} 00:05:05.822 } 00:05:05.822 ]' 00:05:05.822 00:13:04 -- rpc/rpc.sh@17 -- # jq length 00:05:05.822 00:13:04 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:05.822 00:13:04 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:05.822 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 [2024-07-15 00:13:04.790957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:05.822 [2024-07-15 00:13:04.790990] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:05.822 [2024-07-15 00:13:04.791005] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x52f55e0 00:05:05.822 [2024-07-15 00:13:04.791015] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:05.822 [2024-07-15 00:13:04.791731] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:05.822 [2024-07-15 00:13:04.791750] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:05.822 Passthru0 00:05:05.822 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.822 00:13:04 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:05.822 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:05.822 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:05.822 00:13:04 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:05.822 { 00:05:05.822 "name": "Malloc2", 00:05:05.822 "aliases": [ 00:05:05.822 "bcc6b68e-502e-4b77-af7d-3157100321a2" 00:05:05.822 ], 00:05:05.822 "product_name": "Malloc disk", 00:05:05.822 "block_size": 512, 00:05:05.822 "num_blocks": 16384, 00:05:05.822 "uuid": "bcc6b68e-502e-4b77-af7d-3157100321a2", 00:05:05.822 "assigned_rate_limits": { 00:05:05.822 "rw_ios_per_sec": 0, 00:05:05.822 "rw_mbytes_per_sec": 0, 00:05:05.822 "r_mbytes_per_sec": 0, 00:05:05.822 "w_mbytes_per_sec": 0 00:05:05.822 }, 00:05:05.822 "claimed": true, 00:05:05.822 "claim_type": "exclusive_write", 00:05:05.822 "zoned": false, 00:05:05.822 "supported_io_types": { 00:05:05.822 "read": true, 00:05:05.822 "write": true, 00:05:05.822 "unmap": true, 00:05:05.822 "write_zeroes": true, 00:05:05.822 "flush": true, 00:05:05.822 "reset": true, 00:05:05.822 "compare": false, 00:05:05.822 "compare_and_write": false, 00:05:05.822 "abort": true, 00:05:05.822 "nvme_admin": false, 00:05:05.822 "nvme_io": false 00:05:05.822 }, 00:05:05.822 "memory_domains": [ 00:05:05.822 { 00:05:05.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.822 "dma_device_type": 2 00:05:05.822 } 00:05:05.822 ], 00:05:05.822 "driver_specific": {} 00:05:05.822 }, 00:05:05.822 { 00:05:05.822 "name": "Passthru0", 00:05:05.822 "aliases": [ 00:05:05.822 "a4562526-dcef-59e9-ad8c-9f7bf241cd37" 00:05:05.822 ], 00:05:05.822 "product_name": "passthru", 00:05:05.822 "block_size": 512, 00:05:05.822 "num_blocks": 16384, 00:05:05.822 "uuid": "a4562526-dcef-59e9-ad8c-9f7bf241cd37", 00:05:05.822 "assigned_rate_limits": { 00:05:05.822 "rw_ios_per_sec": 0, 00:05:05.822 "rw_mbytes_per_sec": 0, 00:05:05.822 "r_mbytes_per_sec": 0, 00:05:05.822 "w_mbytes_per_sec": 0 00:05:05.822 }, 00:05:05.822 "claimed": false, 00:05:05.822 "zoned": false, 00:05:05.822 "supported_io_types": { 00:05:05.822 "read": true, 00:05:05.822 "write": true, 00:05:05.822 "unmap": true, 00:05:05.822 "write_zeroes": true, 00:05:05.822 "flush": true, 00:05:05.822 "reset": true, 00:05:05.822 "compare": false, 00:05:05.822 "compare_and_write": false, 00:05:05.822 "abort": true, 00:05:05.822 "nvme_admin": false, 00:05:05.822 "nvme_io": false 00:05:05.822 }, 00:05:05.822 "memory_domains": [ 00:05:05.822 { 00:05:05.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:05.822 "dma_device_type": 2 00:05:05.822 } 00:05:05.822 ], 00:05:05.822 "driver_specific": { 00:05:05.822 "passthru": { 00:05:05.822 "name": "Passthru0", 00:05:05.822 "base_bdev_name": "Malloc2" 00:05:05.822 } 
00:05:05.822 } 00:05:05.822 } 00:05:05.822 ]' 00:05:05.822 00:13:04 -- rpc/rpc.sh@21 -- # jq length 00:05:05.822 00:13:04 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:05.822 00:13:04 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:05.822 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:05.822 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:06.082 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:06.082 00:13:04 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:06.082 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:06.082 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:06.082 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:06.082 00:13:04 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:06.082 00:13:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:06.082 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:06.082 00:13:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:06.082 00:13:04 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:06.082 00:13:04 -- rpc/rpc.sh@26 -- # jq length 00:05:06.082 00:13:04 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:06.082 00:05:06.082 real 0m0.276s 00:05:06.082 user 0m0.166s 00:05:06.082 sys 0m0.044s 00:05:06.082 00:13:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.082 00:13:04 -- common/autotest_common.sh@10 -- # set +x 00:05:06.082 ************************************ 00:05:06.082 END TEST rpc_daemon_integrity 00:05:06.082 ************************************ 00:05:06.082 00:13:04 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:06.082 00:13:04 -- rpc/rpc.sh@84 -- # killprocess 303254 00:05:06.082 00:13:04 -- common/autotest_common.sh@926 -- # '[' -z 303254 ']' 00:05:06.082 00:13:04 -- common/autotest_common.sh@930 -- # kill -0 303254 00:05:06.082 00:13:04 -- common/autotest_common.sh@931 -- # uname 00:05:06.082 00:13:04 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:06.082 00:13:04 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 303254 00:05:06.082 00:13:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:06.082 00:13:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:06.082 00:13:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 303254' 00:05:06.082 killing process with pid 303254 00:05:06.082 00:13:05 -- common/autotest_common.sh@945 -- # kill 303254 00:05:06.082 00:13:05 -- common/autotest_common.sh@950 -- # wait 303254 00:05:06.342 00:05:06.342 real 0m2.372s 00:05:06.342 user 0m3.007s 00:05:06.342 sys 0m0.676s 00:05:06.342 00:13:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.342 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.342 ************************************ 00:05:06.342 END TEST rpc 00:05:06.342 ************************************ 00:05:06.342 00:13:05 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:06.342 00:13:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:06.342 00:13:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:06.342 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.342 ************************************ 00:05:06.342 START TEST rpc_client 00:05:06.342 ************************************ 00:05:06.342 00:13:05 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:06.603 * Looking for test storage... 00:05:06.603 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:06.603 00:13:05 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:06.603 OK 00:05:06.603 00:13:05 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:06.603 00:05:06.603 real 0m0.112s 00:05:06.603 user 0m0.041s 00:05:06.603 sys 0m0.080s 00:05:06.603 00:13:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.603 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.603 ************************************ 00:05:06.603 END TEST rpc_client 00:05:06.603 ************************************ 00:05:06.603 00:13:05 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:06.603 00:13:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:06.603 00:13:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:06.603 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.603 ************************************ 00:05:06.603 START TEST json_config 00:05:06.603 ************************************ 00:05:06.603 00:13:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:06.603 00:13:05 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:06.603 00:13:05 -- nvmf/common.sh@7 -- # uname -s 00:05:06.603 00:13:05 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:06.603 00:13:05 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:06.603 00:13:05 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:06.603 00:13:05 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:06.603 00:13:05 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:06.603 00:13:05 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:06.603 00:13:05 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:06.603 00:13:05 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:06.603 00:13:05 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:06.603 00:13:05 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:06.603 00:13:05 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:06.603 00:13:05 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:06.603 00:13:05 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:06.603 00:13:05 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:06.603 00:13:05 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:06.603 00:13:05 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:06.603 00:13:05 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:06.603 00:13:05 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:06.603 00:13:05 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:06.603 00:13:05 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.603 00:13:05 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.603 00:13:05 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.603 00:13:05 -- paths/export.sh@5 -- # export PATH 00:05:06.603 00:13:05 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.603 00:13:05 -- nvmf/common.sh@46 -- # : 0 00:05:06.603 00:13:05 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:06.603 00:13:05 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:06.603 00:13:05 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:06.603 00:13:05 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:06.603 00:13:05 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:06.603 00:13:05 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:06.603 00:13:05 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:06.603 00:13:05 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:06.603 00:13:05 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:06.603 00:13:05 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:06.603 00:13:05 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:06.603 00:13:05 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:06.603 00:13:05 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:06.603 WARNING: No tests are enabled so not running JSON configuration tests 00:05:06.603 00:13:05 -- json_config/json_config.sh@27 -- # exit 0 00:05:06.603 00:05:06.603 real 0m0.100s 00:05:06.603 user 0m0.044s 00:05:06.603 sys 0m0.057s 00:05:06.603 00:13:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.603 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.603 ************************************ 00:05:06.603 END TEST json_config 00:05:06.603 ************************************ 00:05:06.863 00:13:05 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:06.863 00:13:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:06.863 00:13:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:06.863 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.863 ************************************ 00:05:06.863 START TEST json_config_extra_key 00:05:06.863 ************************************ 00:05:06.863 00:13:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:06.863 00:13:05 -- nvmf/common.sh@7 -- # uname -s 00:05:06.863 00:13:05 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:06.863 00:13:05 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:06.863 00:13:05 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:06.863 00:13:05 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:06.863 00:13:05 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:06.863 00:13:05 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:06.863 00:13:05 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:06.863 00:13:05 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:06.863 00:13:05 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:06.863 00:13:05 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:06.863 00:13:05 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:06.863 00:13:05 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:06.863 00:13:05 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:06.863 00:13:05 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:06.863 00:13:05 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:06.863 00:13:05 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:06.863 00:13:05 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:06.863 00:13:05 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:06.863 00:13:05 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:06.863 00:13:05 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.863 00:13:05 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.863 00:13:05 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.863 00:13:05 -- paths/export.sh@5 -- # export PATH 00:05:06.863 00:13:05 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:06.863 00:13:05 -- nvmf/common.sh@46 -- # : 0 00:05:06.863 00:13:05 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:06.863 00:13:05 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:06.863 00:13:05 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:06.863 00:13:05 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:06.863 00:13:05 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:06.863 00:13:05 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:06.863 00:13:05 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:06.863 00:13:05 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:06.863 00:13:05 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:06.864 INFO: launching applications... 
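The xtrace that follows shows json_config_test_start_app doing the launch: it starts a standalone spdk_tgt with the extra_key.json config passed via --json and a dedicated RPC socket, records the pid, then blocks in waitforlisten until that socket answers. A minimal sketch of the same launch-and-wait pattern, assuming the tree layout shown in this log (the polling loop below stands in for waitforlisten, which adds retry limits and diagnostics on top of the same idea):

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    SOCK=/var/tmp/spdk_tgt.sock

    # Start the target: one core (-m 0x1), 1024 MB of memory (-s 1024),
    # RPC server on $SOCK, configuration applied from JSON at init time.
    $SPDK/build/bin/spdk_tgt -m 0x1 -s 1024 -r $SOCK \
        --json $SPDK/test/json_config/extra_key.json &
    tgt_pid=$!

    # Poll until the RPC socket answers a trivial call.
    until $SPDK/scripts/rpc.py -s $SOCK spdk_get_version >/dev/null 2>&1; do
        sleep 0.1
    done
    echo "target $tgt_pid is up on $SOCK"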
00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=303851 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:06.864 Waiting for target to run... 00:05:06.864 00:13:05 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 303851 /var/tmp/spdk_tgt.sock 00:05:06.864 00:13:05 -- common/autotest_common.sh@819 -- # '[' -z 303851 ']' 00:05:06.864 00:13:05 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:06.864 00:13:05 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:06.864 00:13:05 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:06.864 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:06.864 00:13:05 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:06.864 00:13:05 -- common/autotest_common.sh@10 -- # set +x 00:05:06.864 [2024-07-15 00:13:05.803297] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:06.864 [2024-07-15 00:13:05.803363] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid303851 ] 00:05:06.864 EAL: No free 2048 kB hugepages reported on node 1 00:05:07.124 [2024-07-15 00:13:06.085172] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:07.124 [2024-07-15 00:13:06.146394] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:07.124 [2024-07-15 00:13:06.146507] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:07.692 00:13:06 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:07.692 00:13:06 -- common/autotest_common.sh@852 -- # return 0 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:07.692 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:07.692 INFO: shutting down applications... 
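The shutdown traced below is the graceful-stop pattern from json_config_test_shutdown_app: send SIGINT to the recorded pid, then poll with kill -0 for up to 30 half-second intervals before giving up. A sketch of the same loop, using the pid printed by the launch above:

    app_pid=303851                      # pid recorded at launch
    kill -SIGINT "$app_pid"
    for i in $(seq 1 30); do            # bounded wait: ~15 s total
        if ! kill -0 "$app_pid" 2>/dev/null; then
            echo 'SPDK target shutdown done'
            break
        fi
        sleep 0.5
    done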
00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 303851 ]] 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 303851 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@50 -- # kill -0 303851 00:05:07.692 00:13:06 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@50 -- # kill -0 303851 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:08.261 SPDK target shutdown done 00:05:08.261 00:13:07 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:08.261 Success 00:05:08.261 00:05:08.261 real 0m1.426s 00:05:08.261 user 0m1.176s 00:05:08.261 sys 0m0.362s 00:05:08.261 00:13:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:08.261 00:13:07 -- common/autotest_common.sh@10 -- # set +x 00:05:08.261 ************************************ 00:05:08.261 END TEST json_config_extra_key 00:05:08.261 ************************************ 00:05:08.261 00:13:07 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:08.261 00:13:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:08.261 00:13:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:08.261 00:13:07 -- common/autotest_common.sh@10 -- # set +x 00:05:08.261 ************************************ 00:05:08.261 START TEST alias_rpc 00:05:08.261 ************************************ 00:05:08.261 00:13:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:08.261 * Looking for test storage... 00:05:08.261 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:08.261 00:13:07 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:08.261 00:13:07 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=304121 00:05:08.261 00:13:07 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 304121 00:05:08.261 00:13:07 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:08.261 00:13:07 -- common/autotest_common.sh@819 -- # '[' -z 304121 ']' 00:05:08.261 00:13:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:08.261 00:13:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:08.261 00:13:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:08.261 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:08.261 00:13:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:08.261 00:13:07 -- common/autotest_common.sh@10 -- # set +x 00:05:08.261 [2024-07-15 00:13:07.256065] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:08.261 [2024-07-15 00:13:07.256123] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid304121 ] 00:05:08.261 EAL: No free 2048 kB hugepages reported on node 1 00:05:08.520 [2024-07-15 00:13:07.322038] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:08.520 [2024-07-15 00:13:07.392966] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:08.520 [2024-07-15 00:13:07.393090] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:09.087 00:13:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:09.087 00:13:08 -- common/autotest_common.sh@852 -- # return 0 00:05:09.087 00:13:08 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:09.346 00:13:08 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 304121 00:05:09.346 00:13:08 -- common/autotest_common.sh@926 -- # '[' -z 304121 ']' 00:05:09.346 00:13:08 -- common/autotest_common.sh@930 -- # kill -0 304121 00:05:09.346 00:13:08 -- common/autotest_common.sh@931 -- # uname 00:05:09.346 00:13:08 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:09.346 00:13:08 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 304121 00:05:09.346 00:13:08 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:09.346 00:13:08 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:09.346 00:13:08 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 304121' 00:05:09.346 killing process with pid 304121 00:05:09.346 00:13:08 -- common/autotest_common.sh@945 -- # kill 304121 00:05:09.346 00:13:08 -- common/autotest_common.sh@950 -- # wait 304121 00:05:09.605 00:05:09.605 real 0m1.440s 00:05:09.605 user 0m1.551s 00:05:09.605 sys 0m0.405s 00:05:09.605 00:13:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.605 00:13:08 -- common/autotest_common.sh@10 -- # set +x 00:05:09.605 ************************************ 00:05:09.605 END TEST alias_rpc 00:05:09.605 ************************************ 00:05:09.605 00:13:08 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:09.605 00:13:08 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:09.605 00:13:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:09.605 00:13:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:09.605 00:13:08 -- common/autotest_common.sh@10 -- # set +x 00:05:09.605 ************************************ 00:05:09.605 START TEST spdkcli_tcp 00:05:09.605 ************************************ 00:05:09.605 00:13:08 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:09.864 * Looking for test storage... 
00:05:09.864 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:09.864 00:13:08 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:09.864 00:13:08 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:09.865 00:13:08 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:09.865 00:13:08 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:09.865 00:13:08 -- common/autotest_common.sh@10 -- # set +x 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=304436 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@27 -- # waitforlisten 304436 00:05:09.865 00:13:08 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:09.865 00:13:08 -- common/autotest_common.sh@819 -- # '[' -z 304436 ']' 00:05:09.865 00:13:08 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:09.865 00:13:08 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:09.865 00:13:08 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:09.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:09.865 00:13:08 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:09.865 00:13:08 -- common/autotest_common.sh@10 -- # set +x 00:05:09.865 [2024-07-15 00:13:08.781806] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:09.865 [2024-07-15 00:13:08.781876] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid304436 ] 00:05:09.865 EAL: No free 2048 kB hugepages reported on node 1 00:05:09.865 [2024-07-15 00:13:08.850177] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:10.123 [2024-07-15 00:13:08.927940] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:10.123 [2024-07-15 00:13:08.928064] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:10.123 [2024-07-15 00:13:08.928067] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:10.691 00:13:09 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:10.691 00:13:09 -- common/autotest_common.sh@852 -- # return 0 00:05:10.691 00:13:09 -- spdkcli/tcp.sh@31 -- # socat_pid=304681 00:05:10.691 00:13:09 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:10.691 00:13:09 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:10.691 [ 00:05:10.691 "spdk_get_version", 00:05:10.691 "rpc_get_methods", 00:05:10.691 "trace_get_info", 00:05:10.691 "trace_get_tpoint_group_mask", 00:05:10.691 "trace_disable_tpoint_group", 00:05:10.691 "trace_enable_tpoint_group", 00:05:10.691 "trace_clear_tpoint_mask", 00:05:10.691 "trace_set_tpoint_mask", 00:05:10.691 "vfu_tgt_set_base_path", 00:05:10.691 "framework_get_pci_devices", 00:05:10.691 "framework_get_config", 00:05:10.691 "framework_get_subsystems", 00:05:10.691 "iobuf_get_stats", 00:05:10.691 "iobuf_set_options", 00:05:10.691 "sock_set_default_impl", 00:05:10.691 "sock_impl_set_options", 00:05:10.691 "sock_impl_get_options", 00:05:10.691 "vmd_rescan", 00:05:10.691 "vmd_remove_device", 00:05:10.691 "vmd_enable", 00:05:10.691 "accel_get_stats", 00:05:10.691 "accel_set_options", 00:05:10.691 "accel_set_driver", 00:05:10.691 "accel_crypto_key_destroy", 00:05:10.691 "accel_crypto_keys_get", 00:05:10.691 "accel_crypto_key_create", 00:05:10.691 "accel_assign_opc", 00:05:10.691 "accel_get_module_info", 00:05:10.691 "accel_get_opc_assignments", 00:05:10.691 "notify_get_notifications", 00:05:10.691 "notify_get_types", 00:05:10.691 "bdev_get_histogram", 00:05:10.691 "bdev_enable_histogram", 00:05:10.691 "bdev_set_qos_limit", 00:05:10.691 "bdev_set_qd_sampling_period", 00:05:10.691 "bdev_get_bdevs", 00:05:10.691 "bdev_reset_iostat", 00:05:10.691 "bdev_get_iostat", 00:05:10.691 "bdev_examine", 00:05:10.691 "bdev_wait_for_examine", 00:05:10.691 "bdev_set_options", 00:05:10.691 "scsi_get_devices", 00:05:10.691 "thread_set_cpumask", 00:05:10.691 "framework_get_scheduler", 00:05:10.691 "framework_set_scheduler", 00:05:10.691 "framework_get_reactors", 00:05:10.691 "thread_get_io_channels", 00:05:10.691 "thread_get_pollers", 00:05:10.691 "thread_get_stats", 00:05:10.691 "framework_monitor_context_switch", 00:05:10.691 "spdk_kill_instance", 00:05:10.691 "log_enable_timestamps", 00:05:10.691 "log_get_flags", 00:05:10.691 "log_clear_flag", 00:05:10.691 "log_set_flag", 00:05:10.691 "log_get_level", 00:05:10.691 "log_set_level", 00:05:10.691 "log_get_print_level", 00:05:10.691 "log_set_print_level", 00:05:10.691 "framework_enable_cpumask_locks", 00:05:10.691 "framework_disable_cpumask_locks", 00:05:10.691 "framework_wait_init", 00:05:10.691 
"framework_start_init", 00:05:10.691 "virtio_blk_create_transport", 00:05:10.691 "virtio_blk_get_transports", 00:05:10.691 "vhost_controller_set_coalescing", 00:05:10.691 "vhost_get_controllers", 00:05:10.691 "vhost_delete_controller", 00:05:10.691 "vhost_create_blk_controller", 00:05:10.691 "vhost_scsi_controller_remove_target", 00:05:10.691 "vhost_scsi_controller_add_target", 00:05:10.691 "vhost_start_scsi_controller", 00:05:10.691 "vhost_create_scsi_controller", 00:05:10.691 "ublk_recover_disk", 00:05:10.691 "ublk_get_disks", 00:05:10.691 "ublk_stop_disk", 00:05:10.691 "ublk_start_disk", 00:05:10.691 "ublk_destroy_target", 00:05:10.691 "ublk_create_target", 00:05:10.691 "nbd_get_disks", 00:05:10.691 "nbd_stop_disk", 00:05:10.691 "nbd_start_disk", 00:05:10.691 "env_dpdk_get_mem_stats", 00:05:10.691 "nvmf_subsystem_get_listeners", 00:05:10.691 "nvmf_subsystem_get_qpairs", 00:05:10.691 "nvmf_subsystem_get_controllers", 00:05:10.691 "nvmf_get_stats", 00:05:10.691 "nvmf_get_transports", 00:05:10.691 "nvmf_create_transport", 00:05:10.691 "nvmf_get_targets", 00:05:10.691 "nvmf_delete_target", 00:05:10.691 "nvmf_create_target", 00:05:10.691 "nvmf_subsystem_allow_any_host", 00:05:10.691 "nvmf_subsystem_remove_host", 00:05:10.691 "nvmf_subsystem_add_host", 00:05:10.691 "nvmf_subsystem_remove_ns", 00:05:10.691 "nvmf_subsystem_add_ns", 00:05:10.691 "nvmf_subsystem_listener_set_ana_state", 00:05:10.691 "nvmf_discovery_get_referrals", 00:05:10.691 "nvmf_discovery_remove_referral", 00:05:10.691 "nvmf_discovery_add_referral", 00:05:10.691 "nvmf_subsystem_remove_listener", 00:05:10.691 "nvmf_subsystem_add_listener", 00:05:10.691 "nvmf_delete_subsystem", 00:05:10.691 "nvmf_create_subsystem", 00:05:10.691 "nvmf_get_subsystems", 00:05:10.691 "nvmf_set_crdt", 00:05:10.691 "nvmf_set_config", 00:05:10.691 "nvmf_set_max_subsystems", 00:05:10.691 "iscsi_set_options", 00:05:10.691 "iscsi_get_auth_groups", 00:05:10.691 "iscsi_auth_group_remove_secret", 00:05:10.691 "iscsi_auth_group_add_secret", 00:05:10.691 "iscsi_delete_auth_group", 00:05:10.691 "iscsi_create_auth_group", 00:05:10.691 "iscsi_set_discovery_auth", 00:05:10.691 "iscsi_get_options", 00:05:10.691 "iscsi_target_node_request_logout", 00:05:10.691 "iscsi_target_node_set_redirect", 00:05:10.691 "iscsi_target_node_set_auth", 00:05:10.691 "iscsi_target_node_add_lun", 00:05:10.691 "iscsi_get_connections", 00:05:10.691 "iscsi_portal_group_set_auth", 00:05:10.691 "iscsi_start_portal_group", 00:05:10.691 "iscsi_delete_portal_group", 00:05:10.691 "iscsi_create_portal_group", 00:05:10.691 "iscsi_get_portal_groups", 00:05:10.691 "iscsi_delete_target_node", 00:05:10.691 "iscsi_target_node_remove_pg_ig_maps", 00:05:10.691 "iscsi_target_node_add_pg_ig_maps", 00:05:10.691 "iscsi_create_target_node", 00:05:10.691 "iscsi_get_target_nodes", 00:05:10.691 "iscsi_delete_initiator_group", 00:05:10.691 "iscsi_initiator_group_remove_initiators", 00:05:10.691 "iscsi_initiator_group_add_initiators", 00:05:10.691 "iscsi_create_initiator_group", 00:05:10.691 "iscsi_get_initiator_groups", 00:05:10.691 "vfu_virtio_create_scsi_endpoint", 00:05:10.691 "vfu_virtio_scsi_remove_target", 00:05:10.691 "vfu_virtio_scsi_add_target", 00:05:10.691 "vfu_virtio_create_blk_endpoint", 00:05:10.691 "vfu_virtio_delete_endpoint", 00:05:10.691 "iaa_scan_accel_module", 00:05:10.691 "dsa_scan_accel_module", 00:05:10.691 "ioat_scan_accel_module", 00:05:10.691 "accel_error_inject_error", 00:05:10.691 "bdev_iscsi_delete", 00:05:10.691 "bdev_iscsi_create", 00:05:10.691 "bdev_iscsi_set_options", 
00:05:10.691 "bdev_virtio_attach_controller", 00:05:10.691 "bdev_virtio_scsi_get_devices", 00:05:10.691 "bdev_virtio_detach_controller", 00:05:10.691 "bdev_virtio_blk_set_hotplug", 00:05:10.691 "bdev_ftl_set_property", 00:05:10.691 "bdev_ftl_get_properties", 00:05:10.691 "bdev_ftl_get_stats", 00:05:10.691 "bdev_ftl_unmap", 00:05:10.691 "bdev_ftl_unload", 00:05:10.691 "bdev_ftl_delete", 00:05:10.691 "bdev_ftl_load", 00:05:10.691 "bdev_ftl_create", 00:05:10.691 "bdev_aio_delete", 00:05:10.691 "bdev_aio_rescan", 00:05:10.691 "bdev_aio_create", 00:05:10.691 "blobfs_create", 00:05:10.691 "blobfs_detect", 00:05:10.691 "blobfs_set_cache_size", 00:05:10.691 "bdev_zone_block_delete", 00:05:10.691 "bdev_zone_block_create", 00:05:10.691 "bdev_delay_delete", 00:05:10.691 "bdev_delay_create", 00:05:10.691 "bdev_delay_update_latency", 00:05:10.691 "bdev_split_delete", 00:05:10.691 "bdev_split_create", 00:05:10.691 "bdev_error_inject_error", 00:05:10.691 "bdev_error_delete", 00:05:10.691 "bdev_error_create", 00:05:10.691 "bdev_raid_set_options", 00:05:10.691 "bdev_raid_remove_base_bdev", 00:05:10.691 "bdev_raid_add_base_bdev", 00:05:10.691 "bdev_raid_delete", 00:05:10.691 "bdev_raid_create", 00:05:10.691 "bdev_raid_get_bdevs", 00:05:10.691 "bdev_lvol_grow_lvstore", 00:05:10.691 "bdev_lvol_get_lvols", 00:05:10.691 "bdev_lvol_get_lvstores", 00:05:10.691 "bdev_lvol_delete", 00:05:10.691 "bdev_lvol_set_read_only", 00:05:10.691 "bdev_lvol_resize", 00:05:10.691 "bdev_lvol_decouple_parent", 00:05:10.691 "bdev_lvol_inflate", 00:05:10.691 "bdev_lvol_rename", 00:05:10.691 "bdev_lvol_clone_bdev", 00:05:10.691 "bdev_lvol_clone", 00:05:10.691 "bdev_lvol_snapshot", 00:05:10.691 "bdev_lvol_create", 00:05:10.692 "bdev_lvol_delete_lvstore", 00:05:10.692 "bdev_lvol_rename_lvstore", 00:05:10.692 "bdev_lvol_create_lvstore", 00:05:10.692 "bdev_passthru_delete", 00:05:10.692 "bdev_passthru_create", 00:05:10.692 "bdev_nvme_cuse_unregister", 00:05:10.692 "bdev_nvme_cuse_register", 00:05:10.692 "bdev_opal_new_user", 00:05:10.692 "bdev_opal_set_lock_state", 00:05:10.692 "bdev_opal_delete", 00:05:10.692 "bdev_opal_get_info", 00:05:10.692 "bdev_opal_create", 00:05:10.692 "bdev_nvme_opal_revert", 00:05:10.692 "bdev_nvme_opal_init", 00:05:10.692 "bdev_nvme_send_cmd", 00:05:10.692 "bdev_nvme_get_path_iostat", 00:05:10.692 "bdev_nvme_get_mdns_discovery_info", 00:05:10.692 "bdev_nvme_stop_mdns_discovery", 00:05:10.692 "bdev_nvme_start_mdns_discovery", 00:05:10.692 "bdev_nvme_set_multipath_policy", 00:05:10.692 "bdev_nvme_set_preferred_path", 00:05:10.692 "bdev_nvme_get_io_paths", 00:05:10.692 "bdev_nvme_remove_error_injection", 00:05:10.692 "bdev_nvme_add_error_injection", 00:05:10.692 "bdev_nvme_get_discovery_info", 00:05:10.692 "bdev_nvme_stop_discovery", 00:05:10.692 "bdev_nvme_start_discovery", 00:05:10.692 "bdev_nvme_get_controller_health_info", 00:05:10.692 "bdev_nvme_disable_controller", 00:05:10.692 "bdev_nvme_enable_controller", 00:05:10.692 "bdev_nvme_reset_controller", 00:05:10.692 "bdev_nvme_get_transport_statistics", 00:05:10.692 "bdev_nvme_apply_firmware", 00:05:10.692 "bdev_nvme_detach_controller", 00:05:10.692 "bdev_nvme_get_controllers", 00:05:10.692 "bdev_nvme_attach_controller", 00:05:10.692 "bdev_nvme_set_hotplug", 00:05:10.692 "bdev_nvme_set_options", 00:05:10.692 "bdev_null_resize", 00:05:10.692 "bdev_null_delete", 00:05:10.692 "bdev_null_create", 00:05:10.692 "bdev_malloc_delete", 00:05:10.692 "bdev_malloc_create" 00:05:10.692 ] 00:05:10.950 00:13:09 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:10.950 00:13:09 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:10.950 00:13:09 -- common/autotest_common.sh@10 -- # set +x 00:05:10.950 00:13:09 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:10.950 00:13:09 -- spdkcli/tcp.sh@38 -- # killprocess 304436 00:05:10.950 00:13:09 -- common/autotest_common.sh@926 -- # '[' -z 304436 ']' 00:05:10.950 00:13:09 -- common/autotest_common.sh@930 -- # kill -0 304436 00:05:10.950 00:13:09 -- common/autotest_common.sh@931 -- # uname 00:05:10.950 00:13:09 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:10.950 00:13:09 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 304436 00:05:10.950 00:13:09 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:10.950 00:13:09 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:10.950 00:13:09 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 304436' 00:05:10.950 killing process with pid 304436 00:05:10.950 00:13:09 -- common/autotest_common.sh@945 -- # kill 304436 00:05:10.950 00:13:09 -- common/autotest_common.sh@950 -- # wait 304436 00:05:11.209 00:05:11.209 real 0m1.513s 00:05:11.209 user 0m2.793s 00:05:11.209 sys 0m0.471s 00:05:11.209 00:13:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:11.209 00:13:10 -- common/autotest_common.sh@10 -- # set +x 00:05:11.209 ************************************ 00:05:11.209 END TEST spdkcli_tcp 00:05:11.209 ************************************ 00:05:11.209 00:13:10 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:11.209 00:13:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:11.209 00:13:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:11.209 00:13:10 -- common/autotest_common.sh@10 -- # set +x 00:05:11.209 ************************************ 00:05:11.209 START TEST dpdk_mem_utility 00:05:11.209 ************************************ 00:05:11.209 00:13:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:11.467 * Looking for test storage... 00:05:11.467 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:11.467 00:13:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:11.467 00:13:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=304774 00:05:11.467 00:13:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 304774 00:05:11.467 00:13:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:11.467 00:13:10 -- common/autotest_common.sh@819 -- # '[' -z 304774 ']' 00:05:11.467 00:13:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:11.467 00:13:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:11.467 00:13:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:11.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
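The dpdk_mem_utility test starting here exercises two pieces: the env_dpdk_get_mem_stats RPC, which has the target write a raw DPDK memory dump (the RPC reply below names the file, /tmp/spdk_mem_dump.txt), and scripts/dpdk_mem_info.py, which digests that dump into the heap/mempool/memzone summary that follows; -m 0 adds the per-element detail for heap 0. Run by hand against an already-listening target, the sequence is:

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

    # Ask the target to dump its DPDK memory state to a file.
    $SPDK/scripts/rpc.py env_dpdk_get_mem_stats

    # Summarize the dump, then print heap 0 element by element.
    $SPDK/scripts/dpdk_mem_info.py
    $SPDK/scripts/dpdk_mem_info.py -m 0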
00:05:11.467 00:13:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:11.467 00:13:10 -- common/autotest_common.sh@10 -- # set +x 00:05:11.467 [2024-07-15 00:13:10.342617] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:11.467 [2024-07-15 00:13:10.342706] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid304774 ] 00:05:11.467 EAL: No free 2048 kB hugepages reported on node 1 00:05:11.467 [2024-07-15 00:13:10.411854] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:11.467 [2024-07-15 00:13:10.483755] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:11.467 [2024-07-15 00:13:10.483874] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:12.403 00:13:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:12.403 00:13:11 -- common/autotest_common.sh@852 -- # return 0 00:05:12.403 00:13:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:12.403 00:13:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:12.403 00:13:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:12.403 00:13:11 -- common/autotest_common.sh@10 -- # set +x 00:05:12.403 { 00:05:12.403 "filename": "/tmp/spdk_mem_dump.txt" 00:05:12.403 } 00:05:12.403 00:13:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:12.403 00:13:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:12.403 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:12.403 1 heaps totaling size 814.000000 MiB 00:05:12.403 size: 814.000000 MiB heap id: 0 00:05:12.403 end heaps---------- 00:05:12.403 8 mempools totaling size 598.116089 MiB 00:05:12.403 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:12.403 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:12.403 size: 84.521057 MiB name: bdev_io_304774 00:05:12.403 size: 51.011292 MiB name: evtpool_304774 00:05:12.403 size: 50.003479 MiB name: msgpool_304774 00:05:12.403 size: 21.763794 MiB name: PDU_Pool 00:05:12.403 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:12.403 size: 0.026123 MiB name: Session_Pool 00:05:12.403 end mempools------- 00:05:12.403 6 memzones totaling size 4.142822 MiB 00:05:12.403 size: 1.000366 MiB name: RG_ring_0_304774 00:05:12.403 size: 1.000366 MiB name: RG_ring_1_304774 00:05:12.404 size: 1.000366 MiB name: RG_ring_4_304774 00:05:12.404 size: 1.000366 MiB name: RG_ring_5_304774 00:05:12.404 size: 0.125366 MiB name: RG_ring_2_304774 00:05:12.404 size: 0.015991 MiB name: RG_ring_3_304774 00:05:12.404 end memzones------- 00:05:12.404 00:13:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:12.404 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:12.404 list of free elements. 
size: 12.519348 MiB 00:05:12.404 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:12.404 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:12.404 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:12.404 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:12.404 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:12.404 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:12.404 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:12.404 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:12.404 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:12.404 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:12.404 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:12.404 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:12.404 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:12.404 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:12.404 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:12.404 list of standard malloc elements. size: 199.218079 MiB 00:05:12.404 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:12.404 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:12.404 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:12.404 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:12.404 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:12.404 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:12.404 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:12.404 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:12.404 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:12.404 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:12.404 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:12.404 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:12.404 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:12.404 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:12.404 list of memzone associated elements. size: 602.262573 MiB 00:05:12.404 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:12.404 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:12.404 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:12.404 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:12.404 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:12.404 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_304774_0 00:05:12.404 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:12.404 associated memzone info: size: 48.002930 MiB name: MP_evtpool_304774_0 00:05:12.404 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:12.404 associated memzone info: size: 48.002930 MiB name: MP_msgpool_304774_0 00:05:12.404 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:12.404 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:12.404 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:12.404 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:12.404 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:12.404 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_304774 00:05:12.404 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:12.404 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_304774 00:05:12.404 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:12.404 associated memzone info: size: 1.007996 MiB name: MP_evtpool_304774 00:05:12.404 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:12.404 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:12.404 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:12.404 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:12.404 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:12.404 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:12.404 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:12.404 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:12.404 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:12.404 associated memzone info: size: 1.000366 MiB name: RG_ring_0_304774 00:05:12.404 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:12.404 associated memzone info: size: 1.000366 MiB name: RG_ring_1_304774 00:05:12.404 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:12.404 associated memzone info: size: 1.000366 MiB name: RG_ring_4_304774 00:05:12.404 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:12.404 associated memzone info: size: 1.000366 MiB name: RG_ring_5_304774 00:05:12.404 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:12.404 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_304774 00:05:12.404 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:12.404 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:12.404 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:12.404 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:12.404 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:12.404 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:12.404 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:12.404 associated memzone info: size: 0.125366 MiB name: RG_ring_2_304774 00:05:12.404 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:12.404 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:12.404 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:12.404 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:12.404 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:12.404 associated memzone info: size: 0.015991 MiB name: RG_ring_3_304774 00:05:12.404 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:12.404 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:12.404 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:12.404 associated memzone info: size: 0.000183 MiB name: MP_msgpool_304774 00:05:12.404 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:12.404 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_304774 00:05:12.404 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:12.404 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:12.404 00:13:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:12.404 00:13:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 304774 00:05:12.404 00:13:11 -- common/autotest_common.sh@926 -- # '[' -z 304774 ']' 00:05:12.404 00:13:11 -- common/autotest_common.sh@930 -- # kill -0 304774 00:05:12.404 00:13:11 -- common/autotest_common.sh@931 -- # uname 00:05:12.404 00:13:11 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:12.404 00:13:11 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 304774 00:05:12.404 00:13:11 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:12.404 00:13:11 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:12.404 00:13:11 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 304774' 00:05:12.404 killing process with pid 304774 00:05:12.404 00:13:11 -- common/autotest_common.sh@945 -- # kill 304774 00:05:12.404 00:13:11 -- common/autotest_common.sh@950 -- # wait 304774 00:05:12.663 00:05:12.663 real 0m1.396s 00:05:12.663 user 0m1.419s 00:05:12.663 sys 0m0.445s 00:05:12.663 00:13:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:12.663 00:13:11 -- common/autotest_common.sh@10 -- # set +x 00:05:12.663 ************************************ 00:05:12.663 END TEST dpdk_mem_utility 00:05:12.663 ************************************ 00:05:12.663 00:13:11 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:12.663 00:13:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:12.663 00:13:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:12.663 00:13:11 -- common/autotest_common.sh@10 -- # set +x 00:05:12.663 
************************************ 00:05:12.663 START TEST event 00:05:12.663 ************************************ 00:05:12.663 00:13:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:12.922 * Looking for test storage... 00:05:12.922 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:12.922 00:13:11 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:12.922 00:13:11 -- bdev/nbd_common.sh@6 -- # set -e 00:05:12.922 00:13:11 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:12.922 00:13:11 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:12.922 00:13:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:12.922 00:13:11 -- common/autotest_common.sh@10 -- # set +x 00:05:12.922 ************************************ 00:05:12.922 START TEST event_perf 00:05:12.922 ************************************ 00:05:12.922 00:13:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:12.922 Running I/O for 1 seconds...[2024-07-15 00:13:11.786966] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:12.922 [2024-07-15 00:13:11.787049] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid305077 ] 00:05:12.922 EAL: No free 2048 kB hugepages reported on node 1 00:05:12.922 [2024-07-15 00:13:11.859254] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:12.922 [2024-07-15 00:13:11.933820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:12.922 [2024-07-15 00:13:11.933834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:12.922 [2024-07-15 00:13:11.933905] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:12.922 [2024-07-15 00:13:11.933903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:14.300 Running I/O for 1 seconds... 00:05:14.300 lcore 0: 198224 00:05:14.300 lcore 1: 198223 00:05:14.300 lcore 2: 198225 00:05:14.300 lcore 3: 198225 00:05:14.300 done. 
00:05:14.300 00:05:14.300 real 0m1.224s 00:05:14.300 user 0m4.132s 00:05:14.300 sys 0m0.090s 00:05:14.300 00:13:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.300 00:13:12 -- common/autotest_common.sh@10 -- # set +x 00:05:14.300 ************************************ 00:05:14.300 END TEST event_perf 00:05:14.300 ************************************ 00:05:14.300 00:13:13 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:14.300 00:13:13 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:14.300 00:13:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:14.300 00:13:13 -- common/autotest_common.sh@10 -- # set +x 00:05:14.300 ************************************ 00:05:14.300 START TEST event_reactor 00:05:14.300 ************************************ 00:05:14.300 00:13:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:14.301 [2024-07-15 00:13:13.063246] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:14.301 [2024-07-15 00:13:13.063343] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid305363 ] 00:05:14.301 EAL: No free 2048 kB hugepages reported on node 1 00:05:14.301 [2024-07-15 00:13:13.133444] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:14.301 [2024-07-15 00:13:13.200199] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:15.236 test_start 00:05:15.236 oneshot 00:05:15.236 tick 100 00:05:15.236 tick 100 00:05:15.236 tick 250 00:05:15.236 tick 100 00:05:15.236 tick 100 00:05:15.236 tick 100 00:05:15.236 tick 250 00:05:15.236 tick 500 00:05:15.236 tick 100 00:05:15.236 tick 100 00:05:15.236 tick 250 00:05:15.236 tick 100 00:05:15.236 tick 100 00:05:15.236 test_end 00:05:15.236 00:05:15.236 real 0m1.217s 00:05:15.236 user 0m1.124s 00:05:15.236 sys 0m0.089s 00:05:15.236 00:13:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.236 00:13:14 -- common/autotest_common.sh@10 -- # set +x 00:05:15.236 ************************************ 00:05:15.236 END TEST event_reactor 00:05:15.236 ************************************ 00:05:15.495 00:13:14 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:15.495 00:13:14 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:15.495 00:13:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:15.495 00:13:14 -- common/autotest_common.sh@10 -- # set +x 00:05:15.495 ************************************ 00:05:15.495 START TEST event_reactor_perf 00:05:15.495 ************************************ 00:05:15.495 00:13:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:15.495 [2024-07-15 00:13:14.332618] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:15.495 [2024-07-15 00:13:14.332707] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid305653 ] 00:05:15.495 EAL: No free 2048 kB hugepages reported on node 1 00:05:15.495 [2024-07-15 00:13:14.404378] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:15.495 [2024-07-15 00:13:14.477133] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.874 test_start 00:05:16.874 test_end 00:05:16.874 Performance: 907081 events per second 00:05:16.874 00:05:16.874 real 0m1.224s 00:05:16.874 user 0m1.131s 00:05:16.874 sys 0m0.089s 00:05:16.874 00:13:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:16.874 00:13:15 -- common/autotest_common.sh@10 -- # set +x 00:05:16.874 ************************************ 00:05:16.874 END TEST event_reactor_perf 00:05:16.874 ************************************ 00:05:16.874 00:13:15 -- event/event.sh@49 -- # uname -s 00:05:16.874 00:13:15 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:16.874 00:13:15 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:16.874 00:13:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.874 00:13:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.874 00:13:15 -- common/autotest_common.sh@10 -- # set +x 00:05:16.874 ************************************ 00:05:16.874 START TEST event_scheduler 00:05:16.874 ************************************ 00:05:16.874 00:13:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:16.874 * Looking for test storage... 00:05:16.874 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:16.874 00:13:15 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:16.874 00:13:15 -- scheduler/scheduler.sh@35 -- # scheduler_pid=305966 00:05:16.874 00:13:15 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:16.874 00:13:15 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:16.874 00:13:15 -- scheduler/scheduler.sh@37 -- # waitforlisten 305966 00:05:16.874 00:13:15 -- common/autotest_common.sh@819 -- # '[' -z 305966 ']' 00:05:16.874 00:13:15 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:16.874 00:13:15 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:16.874 00:13:15 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:16.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:16.874 00:13:15 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:16.874 00:13:15 -- common/autotest_common.sh@10 -- # set +x 00:05:16.874 [2024-07-15 00:13:15.717658] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:16.874 [2024-07-15 00:13:15.717751] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid305966 ] 00:05:16.874 EAL: No free 2048 kB hugepages reported on node 1 00:05:16.874 [2024-07-15 00:13:15.786065] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:16.874 [2024-07-15 00:13:15.857903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.874 [2024-07-15 00:13:15.857987] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:16.874 [2024-07-15 00:13:15.858005] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:16.874 [2024-07-15 00:13:15.858006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:17.812 00:13:16 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:17.812 00:13:16 -- common/autotest_common.sh@852 -- # return 0 00:05:17.812 00:13:16 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:17.812 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.812 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.812 POWER: Env isn't set yet! 00:05:17.812 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:17.812 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:17.812 POWER: Cannot set governor of lcore 0 to userspace 00:05:17.812 POWER: Attempting to initialise PSTAT power management... 00:05:17.812 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:17.812 POWER: Initialized successfully for lcore 0 power management 00:05:17.812 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:17.812 POWER: Initialized successfully for lcore 1 power management 00:05:17.812 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:17.812 POWER: Initialized successfully for lcore 2 power management 00:05:17.812 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:17.812 POWER: Initialized successfully for lcore 3 power management 00:05:17.812 [2024-07-15 00:13:16.583658] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:17.813 [2024-07-15 00:13:16.583674] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:17.813 [2024-07-15 00:13:16.583685] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 [2024-07-15 00:13:16.654825] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
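The POWER lines above show the DPDK power library failing the ACPI cpufreq path (it cannot set the governor to userspace) and falling back to PSTAT, setting each lcore's governor to 'performance'. In sysfs terms that amounts to roughly the following (standard cpufreq interface, run as root; a sketch of the effect, not what the library literally executes):

    # Per-lcore governor switch, as the "set to 'performance'" messages imply.
    for cpu in 0 1 2 3; do
        gov=/sys/devices/system/cpu/cpu${cpu}/cpufreq/scaling_governor
        cat "$gov"                     # e.g. powersave before the test starts
        echo performance > "$gov"      # restored to the original on shutdown
    done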
00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:17.813 00:13:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:17.813 00:13:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 ************************************ 00:05:17.813 START TEST scheduler_create_thread 00:05:17.813 ************************************ 00:05:17.813 00:13:16 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 2 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 3 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 4 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 5 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 6 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 7 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 8 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 9 00:05:17.813 
00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 10 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:17.813 00:13:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:17.813 00:13:16 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:17.813 00:13:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.813 00:13:16 -- common/autotest_common.sh@10 -- # set +x 00:05:18.751 00:13:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:18.751 00:13:17 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:18.751 00:13:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:18.751 00:13:17 -- common/autotest_common.sh@10 -- # set +x 00:05:20.130 00:13:19 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:20.130 00:13:19 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:20.130 00:13:19 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:20.130 00:13:19 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:20.130 00:13:19 -- common/autotest_common.sh@10 -- # set +x 00:05:21.066 00:13:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.066 00:05:21.066 real 0m3.382s 00:05:21.066 user 0m0.022s 00:05:21.066 sys 0m0.009s 00:05:21.066 00:13:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:21.066 00:13:20 -- common/autotest_common.sh@10 -- # set +x 00:05:21.066 ************************************ 00:05:21.066 END TEST scheduler_create_thread 00:05:21.066 ************************************ 00:05:21.066 00:13:20 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:21.066 00:13:20 -- scheduler/scheduler.sh@46 -- # killprocess 305966 00:05:21.066 00:13:20 -- common/autotest_common.sh@926 -- # '[' -z 305966 ']' 00:05:21.066 00:13:20 -- common/autotest_common.sh@930 -- # kill -0 305966 00:05:21.066 00:13:20 -- common/autotest_common.sh@931 -- # uname 00:05:21.066 00:13:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:21.066 00:13:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 305966 00:05:21.324 00:13:20 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:21.324 00:13:20 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:21.324 00:13:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 305966' 00:05:21.324 killing process with pid 305966 00:05:21.324 00:13:20 -- common/autotest_common.sh@945 -- # kill 305966 00:05:21.324 00:13:20 -- common/autotest_common.sh@950 -- # wait 305966 00:05:21.582 [2024-07-15 00:13:20.426739] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
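Before the shutdown notice, the scheduler_create_thread test drives the test app over its RPC socket with the scheduler_plugin. The call sequence the trace steps through, reconstructed as plain shell (subcommands and arguments are taken from the trace; capturing the returned thread id from stdout is an assumption here):

    # Threads with varying masks and activity, then set_active and delete.
    RPC="scripts/rpc.py --plugin scheduler_plugin"
    $RPC scheduler_thread_create -n active_pinned -m 0x1 -a 100   # pinned, busy
    $RPC scheduler_thread_create -n idle_pinned   -m 0x1 -a 0     # pinned, idle
    thread_id=$($RPC scheduler_thread_create -n half_active -a 0)
    $RPC scheduler_thread_set_active "$thread_id" 50              # 50% active
    thread_id=$($RPC scheduler_thread_create -n deleted -a 100)
    $RPC scheduler_thread_delete "$thread_id"                     # then remove it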
00:05:21.582 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:21.582 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:21.582 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:21.582 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:21.582 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:21.582 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:21.582 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:21.582 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:21.841 00:05:21.841 real 0m5.055s 00:05:21.841 user 0m10.438s 00:05:21.841 sys 0m0.383s 00:05:21.841 00:13:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:21.841 00:13:20 -- common/autotest_common.sh@10 -- # set +x 00:05:21.841 ************************************ 00:05:21.841 END TEST event_scheduler 00:05:21.841 ************************************ 00:05:21.841 00:13:20 -- event/event.sh@51 -- # modprobe -n nbd 00:05:21.841 00:13:20 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:21.841 00:13:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:21.841 00:13:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:21.841 00:13:20 -- common/autotest_common.sh@10 -- # set +x 00:05:21.841 ************************************ 00:05:21.841 START TEST app_repeat 00:05:21.841 ************************************ 00:05:21.841 00:13:20 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:21.841 00:13:20 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:21.841 00:13:20 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:21.841 00:13:20 -- event/event.sh@13 -- # local nbd_list 00:05:21.841 00:13:20 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:21.841 00:13:20 -- event/event.sh@14 -- # local bdev_list 00:05:21.841 00:13:20 -- event/event.sh@15 -- # local repeat_times=4 00:05:21.841 00:13:20 -- event/event.sh@17 -- # modprobe nbd 00:05:21.841 00:13:20 -- event/event.sh@19 -- # repeat_pid=306831 00:05:21.841 00:13:20 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:21.841 00:13:20 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:21.841 00:13:20 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 306831' 00:05:21.841 Process app_repeat pid: 306831 00:05:21.841 00:13:20 -- event/event.sh@23 -- # for i in {0..2} 00:05:21.841 00:13:20 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:21.841 spdk_app_start Round 0 00:05:21.841 00:13:20 -- event/event.sh@25 -- # waitforlisten 306831 /var/tmp/spdk-nbd.sock 00:05:21.841 00:13:20 -- common/autotest_common.sh@819 -- # '[' -z 306831 ']' 00:05:21.841 00:13:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:21.841 00:13:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:21.841 00:13:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:21.841 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:21.841 00:13:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:21.841 00:13:20 -- common/autotest_common.sh@10 -- # set +x 00:05:21.841 [2024-07-15 00:13:20.734147] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:21.841 [2024-07-15 00:13:20.734240] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid306831 ] 00:05:21.841 EAL: No free 2048 kB hugepages reported on node 1 00:05:21.841 [2024-07-15 00:13:20.806558] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:21.841 [2024-07-15 00:13:20.874916] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:21.841 [2024-07-15 00:13:20.874918] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:22.778 00:13:21 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:22.778 00:13:21 -- common/autotest_common.sh@852 -- # return 0 00:05:22.778 00:13:21 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:22.778 Malloc0 00:05:22.778 00:13:21 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:23.038 Malloc1 00:05:23.038 00:13:21 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@12 -- # local i 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:23.038 00:13:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:23.038 /dev/nbd0 00:05:23.038 00:13:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:23.038 00:13:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:23.038 00:13:22 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:23.038 00:13:22 -- common/autotest_common.sh@857 -- # local i 00:05:23.038 00:13:22 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:23.038 00:13:22 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:23.038 00:13:22 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:23.038 00:13:22 -- 
common/autotest_common.sh@861 -- # break 00:05:23.038 00:13:22 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:23.038 00:13:22 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:23.038 00:13:22 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:23.038 1+0 records in 00:05:23.038 1+0 records out 00:05:23.038 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247284 s, 16.6 MB/s 00:05:23.038 00:13:22 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:23.038 00:13:22 -- common/autotest_common.sh@874 -- # size=4096 00:05:23.038 00:13:22 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:23.038 00:13:22 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:23.038 00:13:22 -- common/autotest_common.sh@877 -- # return 0 00:05:23.038 00:13:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:23.038 00:13:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:23.038 00:13:22 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:23.297 /dev/nbd1 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:23.297 00:13:22 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:23.297 00:13:22 -- common/autotest_common.sh@857 -- # local i 00:05:23.297 00:13:22 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:23.297 00:13:22 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:23.297 00:13:22 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:23.297 00:13:22 -- common/autotest_common.sh@861 -- # break 00:05:23.297 00:13:22 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:23.297 00:13:22 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:23.297 00:13:22 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:23.297 1+0 records in 00:05:23.297 1+0 records out 00:05:23.297 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234631 s, 17.5 MB/s 00:05:23.297 00:13:22 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:23.297 00:13:22 -- common/autotest_common.sh@874 -- # size=4096 00:05:23.297 00:13:22 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:23.297 00:13:22 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:23.297 00:13:22 -- common/autotest_common.sh@877 -- # return 0 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:23.297 00:13:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:23.557 { 00:05:23.557 "nbd_device": "/dev/nbd0", 00:05:23.557 "bdev_name": "Malloc0" 00:05:23.557 }, 00:05:23.557 { 00:05:23.557 "nbd_device": 
"/dev/nbd1", 00:05:23.557 "bdev_name": "Malloc1" 00:05:23.557 } 00:05:23.557 ]' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:23.557 { 00:05:23.557 "nbd_device": "/dev/nbd0", 00:05:23.557 "bdev_name": "Malloc0" 00:05:23.557 }, 00:05:23.557 { 00:05:23.557 "nbd_device": "/dev/nbd1", 00:05:23.557 "bdev_name": "Malloc1" 00:05:23.557 } 00:05:23.557 ]' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:23.557 /dev/nbd1' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:23.557 /dev/nbd1' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@65 -- # count=2 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@95 -- # count=2 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:23.557 256+0 records in 00:05:23.557 256+0 records out 00:05:23.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106263 s, 98.7 MB/s 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:23.557 256+0 records in 00:05:23.557 256+0 records out 00:05:23.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200407 s, 52.3 MB/s 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:23.557 256+0 records in 00:05:23.557 256+0 records out 00:05:23.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217102 s, 48.3 MB/s 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:23.557 00:13:22 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@51 -- # local i 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:23.557 00:13:22 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@41 -- # break 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@45 -- # return 0 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:23.816 00:13:22 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@41 -- # break 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@45 -- # return 0 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:24.076 00:13:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:24.076 00:13:23 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:24.076 00:13:23 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:24.076 00:13:23 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@65 -- # true 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@65 -- # count=0 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@104 -- # count=0 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:24.334 00:13:23 -- bdev/nbd_common.sh@109 -- # return 0 00:05:24.334 00:13:23 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
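The dd/cmp lines above are nbd_dd_data_verify's write-then-verify round trip over both nbd devices. The same pattern in isolation (paths shortened; assumes both nbd devices are already connected):

    # 1 MiB of random data, written out with O_DIRECT, then compared back.
    tmp=nbdrandtest
    dd if=/dev/urandom of="$tmp" bs=4096 count=256
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct   # write phase
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$tmp" "$nbd"                              # verify phase
    done
    rm "$tmp"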
00:05:24.334 00:13:23 -- event/event.sh@35 -- # sleep 3 00:05:24.591 [2024-07-15 00:13:23.533313] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:24.591 [2024-07-15 00:13:23.596140] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:24.591 [2024-07-15 00:13:23.596143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:24.591 [2024-07-15 00:13:23.636656] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:24.591 [2024-07-15 00:13:23.636699] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:27.877 00:13:26 -- event/event.sh@23 -- # for i in {0..2} 00:05:27.877 00:13:26 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:27.877 spdk_app_start Round 1 00:05:27.877 00:13:26 -- event/event.sh@25 -- # waitforlisten 306831 /var/tmp/spdk-nbd.sock 00:05:27.877 00:13:26 -- common/autotest_common.sh@819 -- # '[' -z 306831 ']' 00:05:27.877 00:13:26 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:27.877 00:13:26 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:27.877 00:13:26 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:27.877 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:27.877 00:13:26 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:27.877 00:13:26 -- common/autotest_common.sh@10 -- # set +x 00:05:27.877 00:13:26 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:27.877 00:13:26 -- common/autotest_common.sh@852 -- # return 0 00:05:27.877 00:13:26 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:27.877 Malloc0 00:05:27.877 00:13:26 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:27.877 Malloc1 00:05:27.877 00:13:26 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@12 -- # local i 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.877 00:13:26 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:28.137 
/dev/nbd0 00:05:28.137 00:13:27 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:28.137 00:13:27 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:28.137 00:13:27 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:28.137 00:13:27 -- common/autotest_common.sh@857 -- # local i 00:05:28.137 00:13:27 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:28.137 00:13:27 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:28.137 00:13:27 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:28.137 00:13:27 -- common/autotest_common.sh@861 -- # break 00:05:28.137 00:13:27 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:28.137 00:13:27 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:28.137 00:13:27 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:28.137 1+0 records in 00:05:28.137 1+0 records out 00:05:28.137 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286432 s, 14.3 MB/s 00:05:28.137 00:13:27 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:28.137 00:13:27 -- common/autotest_common.sh@874 -- # size=4096 00:05:28.137 00:13:27 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:28.137 00:13:27 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:28.137 00:13:27 -- common/autotest_common.sh@877 -- # return 0 00:05:28.137 00:13:27 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:28.137 00:13:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:28.137 00:13:27 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:28.137 /dev/nbd1 00:05:28.396 00:13:27 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:28.396 00:13:27 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:28.396 00:13:27 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:28.396 00:13:27 -- common/autotest_common.sh@857 -- # local i 00:05:28.396 00:13:27 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:28.396 00:13:27 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:28.396 00:13:27 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:28.396 00:13:27 -- common/autotest_common.sh@861 -- # break 00:05:28.396 00:13:27 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:28.396 00:13:27 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:28.396 00:13:27 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:28.396 1+0 records in 00:05:28.396 1+0 records out 00:05:28.396 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240512 s, 17.0 MB/s 00:05:28.396 00:13:27 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:28.396 00:13:27 -- common/autotest_common.sh@874 -- # size=4096 00:05:28.396 00:13:27 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:28.396 00:13:27 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:28.397 00:13:27 -- common/autotest_common.sh@877 -- # return 0 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:28.397 { 00:05:28.397 "nbd_device": "/dev/nbd0", 00:05:28.397 "bdev_name": "Malloc0" 00:05:28.397 }, 00:05:28.397 { 00:05:28.397 "nbd_device": "/dev/nbd1", 00:05:28.397 "bdev_name": "Malloc1" 00:05:28.397 } 00:05:28.397 ]' 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:28.397 { 00:05:28.397 "nbd_device": "/dev/nbd0", 00:05:28.397 "bdev_name": "Malloc0" 00:05:28.397 }, 00:05:28.397 { 00:05:28.397 "nbd_device": "/dev/nbd1", 00:05:28.397 "bdev_name": "Malloc1" 00:05:28.397 } 00:05:28.397 ]' 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:28.397 /dev/nbd1' 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:28.397 /dev/nbd1' 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@65 -- # count=2 00:05:28.397 00:13:27 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@95 -- # count=2 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:28.656 256+0 records in 00:05:28.656 256+0 records out 00:05:28.656 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113714 s, 92.2 MB/s 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:28.656 256+0 records in 00:05:28.656 256+0 records out 00:05:28.656 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199864 s, 52.5 MB/s 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:28.656 256+0 records in 00:05:28.656 256+0 records out 00:05:28.656 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218525 s, 48.0 MB/s 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:28.656 00:13:27 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:28.657 00:13:27 -- bdev/nbd_common.sh@51 -- # local i 00:05:28.657 00:13:27 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:28.657 00:13:27 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@41 -- # break 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@41 -- # break 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.916 00:13:27 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:29.175 00:13:28 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@65 -- # true 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@65 -- # count=0 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@104 -- # count=0 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:29.175 00:13:28 -- bdev/nbd_common.sh@109 -- # return 0 00:05:29.175 00:13:28 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:29.434 00:13:28 -- event/event.sh@35 -- # sleep 3 00:05:29.693 [2024-07-15 00:13:28.498356] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:29.693 [2024-07-15 00:13:28.561569] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:29.693 [2024-07-15 00:13:28.561571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.693 [2024-07-15 00:13:28.602078] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:29.693 [2024-07-15 00:13:28.602122] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:32.302 00:13:31 -- event/event.sh@23 -- # for i in {0..2} 00:05:32.302 00:13:31 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:32.302 spdk_app_start Round 2 00:05:32.302 00:13:31 -- event/event.sh@25 -- # waitforlisten 306831 /var/tmp/spdk-nbd.sock 00:05:32.302 00:13:31 -- common/autotest_common.sh@819 -- # '[' -z 306831 ']' 00:05:32.302 00:13:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:32.302 00:13:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:32.302 00:13:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:32.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:05:32.302 00:13:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:32.302 00:13:31 -- common/autotest_common.sh@10 -- # set +x 00:05:32.560 00:13:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:32.561 00:13:31 -- common/autotest_common.sh@852 -- # return 0 00:05:32.561 00:13:31 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.819 Malloc0 00:05:32.819 00:13:31 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.819 Malloc1 00:05:32.819 00:13:31 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@12 -- # local i 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:32.819 00:13:31 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:33.077 /dev/nbd0 00:05:33.077 00:13:31 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:33.077 00:13:31 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:33.077 00:13:31 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:33.077 00:13:31 -- common/autotest_common.sh@857 -- # local i 00:05:33.077 00:13:31 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:33.077 00:13:31 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:33.077 00:13:31 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:33.077 00:13:32 -- common/autotest_common.sh@861 -- # break 00:05:33.077 00:13:32 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:33.077 00:13:32 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:33.077 00:13:32 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:33.077 1+0 records in 00:05:33.077 1+0 records out 00:05:33.077 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250068 s, 16.4 MB/s 00:05:33.077 00:13:32 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.077 00:13:32 -- common/autotest_common.sh@874 -- # size=4096 00:05:33.077 00:13:32 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.077 00:13:32 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:33.077 00:13:32 -- common/autotest_common.sh@877 -- # return 0 00:05:33.077 00:13:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:33.077 00:13:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:33.077 00:13:32 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:33.335 /dev/nbd1 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:33.335 00:13:32 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:33.335 00:13:32 -- common/autotest_common.sh@857 -- # local i 00:05:33.335 00:13:32 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:33.335 00:13:32 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:33.335 00:13:32 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:33.335 00:13:32 -- common/autotest_common.sh@861 -- # break 00:05:33.335 00:13:32 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:33.335 00:13:32 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:33.335 00:13:32 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:33.335 1+0 records in 00:05:33.335 1+0 records out 00:05:33.335 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237173 s, 17.3 MB/s 00:05:33.335 00:13:32 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.335 00:13:32 -- common/autotest_common.sh@874 -- # size=4096 00:05:33.335 00:13:32 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.335 00:13:32 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:33.335 00:13:32 -- common/autotest_common.sh@877 -- # return 0 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.335 00:13:32 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:33.593 00:13:32 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:33.593 { 00:05:33.593 "nbd_device": "/dev/nbd0", 00:05:33.593 "bdev_name": "Malloc0" 00:05:33.593 }, 00:05:33.593 { 00:05:33.593 "nbd_device": "/dev/nbd1", 00:05:33.593 "bdev_name": "Malloc1" 00:05:33.593 } 00:05:33.593 ]' 00:05:33.593 00:13:32 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:33.593 { 00:05:33.593 "nbd_device": "/dev/nbd0", 00:05:33.593 "bdev_name": "Malloc0" 00:05:33.593 }, 00:05:33.593 { 00:05:33.593 "nbd_device": "/dev/nbd1", 00:05:33.593 "bdev_name": "Malloc1" 00:05:33.593 } 00:05:33.593 ]' 00:05:33.593 00:13:32 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:33.594 /dev/nbd1' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:33.594 /dev/nbd1' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@65 -- # count=2 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:33.594 00:13:32 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:33.594 256+0 records in 00:05:33.594 256+0 records out 00:05:33.594 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103545 s, 101 MB/s 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:33.594 256+0 records in 00:05:33.594 256+0 records out 00:05:33.594 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0204317 s, 51.3 MB/s 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:33.594 256+0 records in 00:05:33.594 256+0 records out 00:05:33.594 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215134 s, 48.7 MB/s 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@51 -- # local i 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.594 00:13:32 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@41 -- # break 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.853 00:13:32 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@41 -- # break 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@45 -- # return 0 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:34.112 00:13:32 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@65 -- # true 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@65 -- # count=0 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@104 -- # count=0 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:34.112 00:13:33 -- bdev/nbd_common.sh@109 -- # return 0 00:05:34.112 00:13:33 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:34.370 00:13:33 -- event/event.sh@35 -- # sleep 3 00:05:34.629 [2024-07-15 00:13:33.500955] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:34.629 [2024-07-15 00:13:33.566002] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:34.629 [2024-07-15 00:13:33.566004] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.629 [2024-07-15 00:13:33.606577] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:34.629 [2024-07-15 00:13:33.606620] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
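Condensed, the write-and-verify flow traced above amounts to the following bash sketch. Device names, block sizes, transfer counts, and the RPC socket path are taken from this run; the full workspace path to rpc.py is shortened here for readability, so treat this as an illustration of the pattern rather than the literal nbd_common.sh source.

    # Create a 64 MB malloc bdev with 4096-byte blocks and export it over NBD
    rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096        # prints the bdev name, e.g. Malloc0
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0

    # Write a 1 MiB random pattern through the NBD device, then read it back and compare
    dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
    dd if=nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
    cmp -b -n 1M nbdrandtest /dev/nbd0                                 # non-zero exit on any byte mismatch

    # Tear down and confirm nothing is still exported
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks                     # expect []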
00:05:37.917 00:13:36 -- event/event.sh@38 -- # waitforlisten 306831 /var/tmp/spdk-nbd.sock 00:05:37.917 00:13:36 -- common/autotest_common.sh@819 -- # '[' -z 306831 ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:37.917 00:13:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:37.917 00:13:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:37.917 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:37.917 00:13:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:37.917 00:13:36 -- common/autotest_common.sh@10 -- # set +x 00:05:37.917 00:13:36 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:37.917 00:13:36 -- common/autotest_common.sh@852 -- # return 0 00:05:37.917 00:13:36 -- event/event.sh@39 -- # killprocess 306831 00:05:37.917 00:13:36 -- common/autotest_common.sh@926 -- # '[' -z 306831 ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@930 -- # kill -0 306831 00:05:37.917 00:13:36 -- common/autotest_common.sh@931 -- # uname 00:05:37.917 00:13:36 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 306831 00:05:37.917 00:13:36 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:37.917 00:13:36 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 306831' 00:05:37.917 killing process with pid 306831 00:05:37.917 00:13:36 -- common/autotest_common.sh@945 -- # kill 306831 00:05:37.917 00:13:36 -- common/autotest_common.sh@950 -- # wait 306831 00:05:37.917 spdk_app_start is called in Round 0. 00:05:37.917 Shutdown signal received, stop current app iteration 00:05:37.917 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:37.917 spdk_app_start is called in Round 1. 00:05:37.917 Shutdown signal received, stop current app iteration 00:05:37.917 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:37.917 spdk_app_start is called in Round 2. 00:05:37.917 Shutdown signal received, stop current app iteration 00:05:37.917 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:37.917 spdk_app_start is called in Round 3. 
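The killprocess helper exercised here follows a guard-then-kill pattern: confirm the pid is still alive, refuse to target a bare sudo process, then kill it and reap its exit status. A simplified bash restatement of the logic visible in the xtrace (not the verbatim autotest_common.sh source; retries and the non-Linux branch are omitted):

    killprocess() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || return 1               # is the process still alive?
        if [[ $(uname) == Linux ]]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")  # e.g. reactor_0 in this run
            [[ $process_name != sudo ]] || return 1          # never kill a bare sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                          # reap the child, surface its exit status
    }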
00:05:37.917 Shutdown signal received, stop current app iteration 00:05:37.917 00:13:36 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:37.917 00:13:36 -- event/event.sh@42 -- # return 0 00:05:37.917 00:05:37.917 real 0m15.981s 00:05:37.917 user 0m33.739s 00:05:37.917 sys 0m3.045s 00:05:37.917 00:13:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.917 00:13:36 -- common/autotest_common.sh@10 -- # set +x 00:05:37.917 ************************************ 00:05:37.917 END TEST app_repeat 00:05:37.917 ************************************ 00:05:37.917 00:13:36 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:37.917 00:13:36 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:37.917 00:13:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.917 00:13:36 -- common/autotest_common.sh@10 -- # set +x 00:05:37.917 ************************************ 00:05:37.917 START TEST cpu_locks 00:05:37.917 ************************************ 00:05:37.917 00:13:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:37.917 * Looking for test storage... 00:05:37.917 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:37.917 00:13:36 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:37.917 00:13:36 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:37.917 00:13:36 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:37.917 00:13:36 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:37.917 00:13:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.917 00:13:36 -- common/autotest_common.sh@10 -- # set +x 00:05:37.917 ************************************ 00:05:37.917 START TEST default_locks 00:05:37.917 ************************************ 00:05:37.917 00:13:36 -- common/autotest_common.sh@1104 -- # default_locks 00:05:37.917 00:13:36 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=309920 00:05:37.917 00:13:36 -- event/cpu_locks.sh@47 -- # waitforlisten 309920 00:05:37.917 00:13:36 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:37.917 00:13:36 -- common/autotest_common.sh@819 -- # '[' -z 309920 ']' 00:05:37.917 00:13:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:37.917 00:13:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:37.917 00:13:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:37.917 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:37.917 00:13:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:37.917 00:13:36 -- common/autotest_common.sh@10 -- # set +x 00:05:37.917 [2024-07-15 00:13:36.867410] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:37.917 [2024-07-15 00:13:36.867512] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid309920 ] 00:05:37.917 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.917 [2024-07-15 00:13:36.936859] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.175 [2024-07-15 00:13:37.013176] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:38.175 [2024-07-15 00:13:37.013284] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.743 00:13:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:38.743 00:13:37 -- common/autotest_common.sh@852 -- # return 0 00:05:38.743 00:13:37 -- event/cpu_locks.sh@49 -- # locks_exist 309920 00:05:38.743 00:13:37 -- event/cpu_locks.sh@22 -- # lslocks -p 309920 00:05:38.743 00:13:37 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:39.311 lslocks: write error 00:05:39.311 00:13:38 -- event/cpu_locks.sh@50 -- # killprocess 309920 00:05:39.311 00:13:38 -- common/autotest_common.sh@926 -- # '[' -z 309920 ']' 00:05:39.311 00:13:38 -- common/autotest_common.sh@930 -- # kill -0 309920 00:05:39.311 00:13:38 -- common/autotest_common.sh@931 -- # uname 00:05:39.311 00:13:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:39.311 00:13:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 309920 00:05:39.311 00:13:38 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:39.311 00:13:38 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:39.311 00:13:38 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 309920' 00:05:39.311 killing process with pid 309920 00:05:39.311 00:13:38 -- common/autotest_common.sh@945 -- # kill 309920 00:05:39.311 00:13:38 -- common/autotest_common.sh@950 -- # wait 309920 00:05:39.571 00:13:38 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 309920 00:05:39.571 00:13:38 -- common/autotest_common.sh@640 -- # local es=0 00:05:39.571 00:13:38 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 309920 00:05:39.571 00:13:38 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:39.571 00:13:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:39.571 00:13:38 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:39.571 00:13:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:39.571 00:13:38 -- common/autotest_common.sh@643 -- # waitforlisten 309920 00:05:39.571 00:13:38 -- common/autotest_common.sh@819 -- # '[' -z 309920 ']' 00:05:39.571 00:13:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.571 00:13:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:39.571 00:13:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:39.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:39.571 00:13:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:39.571 00:13:38 -- common/autotest_common.sh@10 -- # set +x 00:05:39.571 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (309920) - No such process 00:05:39.571 ERROR: process (pid: 309920) is no longer running 00:05:39.571 00:13:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:39.571 00:13:38 -- common/autotest_common.sh@852 -- # return 1 00:05:39.571 00:13:38 -- common/autotest_common.sh@643 -- # es=1 00:05:39.571 00:13:38 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:39.571 00:13:38 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:39.571 00:13:38 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:39.571 00:13:38 -- event/cpu_locks.sh@54 -- # no_locks 00:05:39.571 00:13:38 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:39.571 00:13:38 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:39.571 00:13:38 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:39.571 00:05:39.571 real 0m1.740s 00:05:39.571 user 0m1.805s 00:05:39.571 sys 0m0.619s 00:05:39.571 00:13:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.571 00:13:38 -- common/autotest_common.sh@10 -- # set +x 00:05:39.571 ************************************ 00:05:39.571 END TEST default_locks 00:05:39.571 ************************************ 00:05:39.571 00:13:38 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:39.571 00:13:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:39.571 00:13:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:39.571 00:13:38 -- common/autotest_common.sh@10 -- # set +x 00:05:39.571 ************************************ 00:05:39.571 START TEST default_locks_via_rpc 00:05:39.571 ************************************ 00:05:39.571 00:13:38 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:05:39.571 00:13:38 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=310324 00:05:39.571 00:13:38 -- event/cpu_locks.sh@63 -- # waitforlisten 310324 00:05:39.571 00:13:38 -- common/autotest_common.sh@819 -- # '[' -z 310324 ']' 00:05:39.571 00:13:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.571 00:13:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:39.572 00:13:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:39.572 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:39.572 00:13:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:39.572 00:13:38 -- common/autotest_common.sh@10 -- # set +x 00:05:39.572 00:13:38 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:39.832 [2024-07-15 00:13:38.647190] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:39.832 [2024-07-15 00:13:38.647259] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid310324 ] 00:05:39.832 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.832 [2024-07-15 00:13:38.714108] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.832 [2024-07-15 00:13:38.789057] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:39.832 [2024-07-15 00:13:38.789166] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.400 00:13:39 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:40.400 00:13:39 -- common/autotest_common.sh@852 -- # return 0 00:05:40.400 00:13:39 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:40.400 00:13:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.400 00:13:39 -- common/autotest_common.sh@10 -- # set +x 00:05:40.659 00:13:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.659 00:13:39 -- event/cpu_locks.sh@67 -- # no_locks 00:05:40.659 00:13:39 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:40.659 00:13:39 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:40.659 00:13:39 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:40.659 00:13:39 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:40.659 00:13:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:40.659 00:13:39 -- common/autotest_common.sh@10 -- # set +x 00:05:40.659 00:13:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:40.659 00:13:39 -- event/cpu_locks.sh@71 -- # locks_exist 310324 00:05:40.659 00:13:39 -- event/cpu_locks.sh@22 -- # lslocks -p 310324 00:05:40.659 00:13:39 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:40.918 00:13:39 -- event/cpu_locks.sh@73 -- # killprocess 310324 00:05:40.918 00:13:39 -- common/autotest_common.sh@926 -- # '[' -z 310324 ']' 00:05:40.918 00:13:39 -- common/autotest_common.sh@930 -- # kill -0 310324 00:05:40.918 00:13:39 -- common/autotest_common.sh@931 -- # uname 00:05:40.918 00:13:39 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:40.918 00:13:39 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 310324 00:05:40.918 00:13:39 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:40.918 00:13:39 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:40.918 00:13:39 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 310324' 00:05:40.918 killing process with pid 310324 00:05:40.918 00:13:39 -- common/autotest_common.sh@945 -- # kill 310324 00:05:40.918 00:13:39 -- common/autotest_common.sh@950 -- # wait 310324 00:05:41.177 00:05:41.177 real 0m1.499s 00:05:41.177 user 0m1.550s 00:05:41.177 sys 0m0.528s 00:05:41.177 00:13:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.177 00:13:40 -- common/autotest_common.sh@10 -- # set +x 00:05:41.177 ************************************ 00:05:41.177 END TEST default_locks_via_rpc 00:05:41.177 ************************************ 00:05:41.177 00:13:40 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:41.177 00:13:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:41.177 00:13:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:41.177 00:13:40 -- common/autotest_common.sh@10 
-- # set +x 00:05:41.177 ************************************ 00:05:41.177 START TEST non_locking_app_on_locked_coremask 00:05:41.177 ************************************ 00:05:41.178 00:13:40 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:05:41.178 00:13:40 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=310637 00:05:41.178 00:13:40 -- event/cpu_locks.sh@81 -- # waitforlisten 310637 /var/tmp/spdk.sock 00:05:41.178 00:13:40 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:41.178 00:13:40 -- common/autotest_common.sh@819 -- # '[' -z 310637 ']' 00:05:41.178 00:13:40 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.178 00:13:40 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:41.178 00:13:40 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.178 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:41.178 00:13:40 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:41.178 00:13:40 -- common/autotest_common.sh@10 -- # set +x 00:05:41.178 [2024-07-15 00:13:40.191052] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:41.178 [2024-07-15 00:13:40.191122] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid310637 ] 00:05:41.178 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.437 [2024-07-15 00:13:40.259604] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.437 [2024-07-15 00:13:40.324776] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:41.437 [2024-07-15 00:13:40.324884] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.006 00:13:40 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:42.006 00:13:40 -- common/autotest_common.sh@852 -- # return 0 00:05:42.006 00:13:40 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=310654 00:05:42.006 00:13:40 -- event/cpu_locks.sh@85 -- # waitforlisten 310654 /var/tmp/spdk2.sock 00:05:42.006 00:13:40 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:42.006 00:13:40 -- common/autotest_common.sh@819 -- # '[' -z 310654 ']' 00:05:42.006 00:13:40 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:42.006 00:13:40 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:42.006 00:13:40 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:42.006 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:42.006 00:13:40 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:42.006 00:13:40 -- common/autotest_common.sh@10 -- # set +x 00:05:42.006 [2024-07-15 00:13:41.013047] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:42.006 [2024-07-15 00:13:41.013120] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid310654 ] 00:05:42.006 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.266 [2024-07-15 00:13:41.109645] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:42.266 [2024-07-15 00:13:41.109676] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.266 [2024-07-15 00:13:41.247156] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:42.266 [2024-07-15 00:13:41.247286] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.834 00:13:41 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:42.834 00:13:41 -- common/autotest_common.sh@852 -- # return 0 00:05:42.834 00:13:41 -- event/cpu_locks.sh@87 -- # locks_exist 310637 00:05:42.834 00:13:41 -- event/cpu_locks.sh@22 -- # lslocks -p 310637 00:05:42.834 00:13:41 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:44.214 lslocks: write error 00:05:44.214 00:13:43 -- event/cpu_locks.sh@89 -- # killprocess 310637 00:05:44.214 00:13:43 -- common/autotest_common.sh@926 -- # '[' -z 310637 ']' 00:05:44.214 00:13:43 -- common/autotest_common.sh@930 -- # kill -0 310637 00:05:44.214 00:13:43 -- common/autotest_common.sh@931 -- # uname 00:05:44.214 00:13:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:44.214 00:13:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 310637 00:05:44.214 00:13:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:44.214 00:13:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:44.214 00:13:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 310637' 00:05:44.214 killing process with pid 310637 00:05:44.214 00:13:43 -- common/autotest_common.sh@945 -- # kill 310637 00:05:44.214 00:13:43 -- common/autotest_common.sh@950 -- # wait 310637 00:05:44.781 00:13:43 -- event/cpu_locks.sh@90 -- # killprocess 310654 00:05:44.781 00:13:43 -- common/autotest_common.sh@926 -- # '[' -z 310654 ']' 00:05:44.781 00:13:43 -- common/autotest_common.sh@930 -- # kill -0 310654 00:05:44.781 00:13:43 -- common/autotest_common.sh@931 -- # uname 00:05:44.781 00:13:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:44.781 00:13:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 310654 00:05:44.781 00:13:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:44.781 00:13:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:44.781 00:13:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 310654' 00:05:44.781 killing process with pid 310654 00:05:44.781 00:13:43 -- common/autotest_common.sh@945 -- # kill 310654 00:05:44.781 00:13:43 -- common/autotest_common.sh@950 -- # wait 310654 00:05:45.347 00:05:45.347 real 0m3.930s 00:05:45.347 user 0m4.175s 00:05:45.347 sys 0m1.316s 00:05:45.347 00:13:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.347 00:13:44 -- common/autotest_common.sh@10 -- # set +x 00:05:45.347 ************************************ 00:05:45.347 END TEST non_locking_app_on_locked_coremask 00:05:45.347 ************************************ 00:05:45.347 00:13:44 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 
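The locks_exist probe that brackets each of these scenarios is a one-line pipeline over lslocks, asking whether the target pid holds a lock on one of the /var/tmp/spdk_cpu_lock_* files; the stray "lslocks: write error" lines are lslocks hitting a closed pipe after grep -q matches and exits early, not a test failure. A sketch of the pattern as it appears in the trace:

    locks_exist() {
        local pid=$1
        # the target claims one lock file per CPU core, e.g. /var/tmp/spdk_cpu_lock_000
        lslocks -p "$pid" | grep -q spdk_cpu_lock    # non-zero exit if no core lock is held
    }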
00:05:45.347 00:13:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:45.347 00:13:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.347 00:13:44 -- common/autotest_common.sh@10 -- # set +x 00:05:45.347 ************************************ 00:05:45.347 START TEST locking_app_on_unlocked_coremask 00:05:45.347 ************************************ 00:05:45.347 00:13:44 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:05:45.347 00:13:44 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=311231 00:05:45.348 00:13:44 -- event/cpu_locks.sh@99 -- # waitforlisten 311231 /var/tmp/spdk.sock 00:05:45.348 00:13:44 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:45.348 00:13:44 -- common/autotest_common.sh@819 -- # '[' -z 311231 ']' 00:05:45.348 00:13:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:45.348 00:13:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:45.348 00:13:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:45.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:45.348 00:13:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:45.348 00:13:44 -- common/autotest_common.sh@10 -- # set +x 00:05:45.348 [2024-07-15 00:13:44.168306] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:45.348 [2024-07-15 00:13:44.168383] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid311231 ] 00:05:45.348 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.348 [2024-07-15 00:13:44.237858] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:45.348 [2024-07-15 00:13:44.237882] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.348 [2024-07-15 00:13:44.313414] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:45.348 [2024-07-15 00:13:44.313528] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.283 00:13:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:46.283 00:13:44 -- common/autotest_common.sh@852 -- # return 0 00:05:46.283 00:13:44 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=311497 00:05:46.283 00:13:44 -- event/cpu_locks.sh@103 -- # waitforlisten 311497 /var/tmp/spdk2.sock 00:05:46.283 00:13:44 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:46.283 00:13:44 -- common/autotest_common.sh@819 -- # '[' -z 311497 ']' 00:05:46.283 00:13:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:46.283 00:13:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:46.283 00:13:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:46.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:05:46.283 00:13:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:46.283 00:13:44 -- common/autotest_common.sh@10 -- # set +x 00:05:46.283 [2024-07-15 00:13:45.003858] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:46.283 [2024-07-15 00:13:45.003922] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid311497 ] 00:05:46.283 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.283 [2024-07-15 00:13:45.094511] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.283 [2024-07-15 00:13:45.231692] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:46.283 [2024-07-15 00:13:45.231819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.850 00:13:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:46.850 00:13:45 -- common/autotest_common.sh@852 -- # return 0 00:05:46.850 00:13:45 -- event/cpu_locks.sh@105 -- # locks_exist 311497 00:05:46.850 00:13:45 -- event/cpu_locks.sh@22 -- # lslocks -p 311497 00:05:46.850 00:13:45 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:47.784 lslocks: write error 00:05:47.784 00:13:46 -- event/cpu_locks.sh@107 -- # killprocess 311231 00:05:47.784 00:13:46 -- common/autotest_common.sh@926 -- # '[' -z 311231 ']' 00:05:47.784 00:13:46 -- common/autotest_common.sh@930 -- # kill -0 311231 00:05:47.784 00:13:46 -- common/autotest_common.sh@931 -- # uname 00:05:47.784 00:13:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:47.784 00:13:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 311231 00:05:47.784 00:13:46 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:47.784 00:13:46 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:47.784 00:13:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 311231' 00:05:47.784 killing process with pid 311231 00:05:47.784 00:13:46 -- common/autotest_common.sh@945 -- # kill 311231 00:05:47.784 00:13:46 -- common/autotest_common.sh@950 -- # wait 311231 00:05:48.350 00:13:47 -- event/cpu_locks.sh@108 -- # killprocess 311497 00:05:48.350 00:13:47 -- common/autotest_common.sh@926 -- # '[' -z 311497 ']' 00:05:48.350 00:13:47 -- common/autotest_common.sh@930 -- # kill -0 311497 00:05:48.350 00:13:47 -- common/autotest_common.sh@931 -- # uname 00:05:48.350 00:13:47 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:48.350 00:13:47 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 311497 00:05:48.350 00:13:47 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:48.350 00:13:47 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:48.350 00:13:47 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 311497' 00:05:48.350 killing process with pid 311497 00:05:48.350 00:13:47 -- common/autotest_common.sh@945 -- # kill 311497 00:05:48.350 00:13:47 -- common/autotest_common.sh@950 -- # wait 311497 00:05:48.609 00:05:48.609 real 0m3.436s 00:05:48.609 user 0m3.646s 00:05:48.609 sys 0m1.074s 00:05:48.609 00:13:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.609 00:13:47 -- common/autotest_common.sh@10 -- # set +x 00:05:48.609 ************************************ 00:05:48.609 END TEST locking_app_on_unlocked_coremask 00:05:48.609 
************************************ 00:05:48.609 00:13:47 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:48.609 00:13:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.609 00:13:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.609 00:13:47 -- common/autotest_common.sh@10 -- # set +x 00:05:48.609 ************************************ 00:05:48.609 START TEST locking_app_on_locked_coremask 00:05:48.609 ************************************ 00:05:48.609 00:13:47 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:05:48.609 00:13:47 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=311958 00:05:48.609 00:13:47 -- event/cpu_locks.sh@116 -- # waitforlisten 311958 /var/tmp/spdk.sock 00:05:48.609 00:13:47 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:48.609 00:13:47 -- common/autotest_common.sh@819 -- # '[' -z 311958 ']' 00:05:48.609 00:13:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.609 00:13:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:48.610 00:13:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.610 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.610 00:13:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:48.610 00:13:47 -- common/autotest_common.sh@10 -- # set +x 00:05:48.610 [2024-07-15 00:13:47.653506] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:48.610 [2024-07-15 00:13:47.653583] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid311958 ] 00:05:48.868 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.868 [2024-07-15 00:13:47.722798] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.869 [2024-07-15 00:13:47.792219] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.869 [2024-07-15 00:13:47.792349] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.435 00:13:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:49.435 00:13:48 -- common/autotest_common.sh@852 -- # return 0 00:05:49.436 00:13:48 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=312075 00:05:49.436 00:13:48 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 312075 /var/tmp/spdk2.sock 00:05:49.436 00:13:48 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:49.436 00:13:48 -- common/autotest_common.sh@640 -- # local es=0 00:05:49.436 00:13:48 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 312075 /var/tmp/spdk2.sock 00:05:49.436 00:13:48 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:49.436 00:13:48 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:49.436 00:13:48 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:49.436 00:13:48 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:49.436 00:13:48 -- common/autotest_common.sh@643 -- # waitforlisten 312075 /var/tmp/spdk2.sock 00:05:49.436 00:13:48 -- common/autotest_common.sh@819 -- # '[' -z 312075 
']' 00:05:49.436 00:13:48 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:49.436 00:13:48 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:49.436 00:13:48 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:49.436 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:49.436 00:13:48 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:49.436 00:13:48 -- common/autotest_common.sh@10 -- # set +x 00:05:49.436 [2024-07-15 00:13:48.489704] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:49.436 [2024-07-15 00:13:48.489794] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312075 ] 00:05:49.693 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.693 [2024-07-15 00:13:48.580338] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 311958 has claimed it. 00:05:49.693 [2024-07-15 00:13:48.580376] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:50.262 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (312075) - No such process 00:05:50.262 ERROR: process (pid: 312075) is no longer running 00:05:50.262 00:13:49 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:50.262 00:13:49 -- common/autotest_common.sh@852 -- # return 1 00:05:50.262 00:13:49 -- common/autotest_common.sh@643 -- # es=1 00:05:50.262 00:13:49 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:50.262 00:13:49 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:50.262 00:13:49 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:50.262 00:13:49 -- event/cpu_locks.sh@122 -- # locks_exist 311958 00:05:50.262 00:13:49 -- event/cpu_locks.sh@22 -- # lslocks -p 311958 00:05:50.262 00:13:49 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:50.830 lslocks: write error 00:05:50.830 00:13:49 -- event/cpu_locks.sh@124 -- # killprocess 311958 00:05:50.830 00:13:49 -- common/autotest_common.sh@926 -- # '[' -z 311958 ']' 00:05:50.830 00:13:49 -- common/autotest_common.sh@930 -- # kill -0 311958 00:05:50.830 00:13:49 -- common/autotest_common.sh@931 -- # uname 00:05:50.830 00:13:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:50.830 00:13:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 311958 00:05:50.830 00:13:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:50.830 00:13:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:50.830 00:13:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 311958' 00:05:50.830 killing process with pid 311958 00:05:50.830 00:13:49 -- common/autotest_common.sh@945 -- # kill 311958 00:05:50.830 00:13:49 -- common/autotest_common.sh@950 -- # wait 311958 00:05:51.399 00:05:51.399 real 0m2.536s 00:05:51.399 user 0m2.730s 00:05:51.399 sys 0m0.768s 00:05:51.399 00:13:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.399 00:13:50 -- common/autotest_common.sh@10 -- # set +x 00:05:51.399 ************************************ 00:05:51.399 END TEST locking_app_on_locked_coremask 00:05:51.399 ************************************ 00:05:51.399 00:13:50 -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:51.399 00:13:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:51.399 00:13:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:51.399 00:13:50 -- common/autotest_common.sh@10 -- # set +x 00:05:51.399 ************************************ 00:05:51.399 START TEST locking_overlapped_coremask 00:05:51.399 ************************************ 00:05:51.399 00:13:50 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:05:51.399 00:13:50 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=312385 00:05:51.399 00:13:50 -- event/cpu_locks.sh@133 -- # waitforlisten 312385 /var/tmp/spdk.sock 00:05:51.399 00:13:50 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:51.399 00:13:50 -- common/autotest_common.sh@819 -- # '[' -z 312385 ']' 00:05:51.399 00:13:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:51.399 00:13:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:51.399 00:13:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:51.399 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.399 00:13:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:51.399 00:13:50 -- common/autotest_common.sh@10 -- # set +x 00:05:51.399 [2024-07-15 00:13:50.241500] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:51.399 [2024-07-15 00:13:50.241596] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312385 ] 00:05:51.399 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.399 [2024-07-15 00:13:50.311160] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:51.399 [2024-07-15 00:13:50.378909] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:51.399 [2024-07-15 00:13:50.379068] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:51.399 [2024-07-15 00:13:50.379183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:51.399 [2024-07-15 00:13:50.379185] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.336 00:13:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:52.336 00:13:51 -- common/autotest_common.sh@852 -- # return 0 00:05:52.336 00:13:51 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:52.336 00:13:51 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=312653 00:05:52.336 00:13:51 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 312653 /var/tmp/spdk2.sock 00:05:52.336 00:13:51 -- common/autotest_common.sh@640 -- # local es=0 00:05:52.336 00:13:51 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 312653 /var/tmp/spdk2.sock 00:05:52.336 00:13:51 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:52.336 00:13:51 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:52.336 00:13:51 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:52.336 00:13:51 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:52.336 00:13:51 -- 
common/autotest_common.sh@643 -- # waitforlisten 312653 /var/tmp/spdk2.sock 00:05:52.336 00:13:51 -- common/autotest_common.sh@819 -- # '[' -z 312653 ']' 00:05:52.336 00:13:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:52.336 00:13:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:52.336 00:13:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:52.336 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:52.336 00:13:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:52.336 00:13:51 -- common/autotest_common.sh@10 -- # set +x 00:05:52.336 [2024-07-15 00:13:51.060960] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:52.336 [2024-07-15 00:13:51.061024] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312653 ] 00:05:52.336 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.336 [2024-07-15 00:13:51.152663] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 312385 has claimed it. 00:05:52.336 [2024-07-15 00:13:51.152703] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:52.903 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (312653) - No such process 00:05:52.903 ERROR: process (pid: 312653) is no longer running 00:05:52.903 00:13:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:52.903 00:13:51 -- common/autotest_common.sh@852 -- # return 1 00:05:52.903 00:13:51 -- common/autotest_common.sh@643 -- # es=1 00:05:52.903 00:13:51 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:52.903 00:13:51 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:52.903 00:13:51 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:52.903 00:13:51 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:52.903 00:13:51 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:52.903 00:13:51 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:52.903 00:13:51 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:52.903 00:13:51 -- event/cpu_locks.sh@141 -- # killprocess 312385 00:05:52.903 00:13:51 -- common/autotest_common.sh@926 -- # '[' -z 312385 ']' 00:05:52.903 00:13:51 -- common/autotest_common.sh@930 -- # kill -0 312385 00:05:52.903 00:13:51 -- common/autotest_common.sh@931 -- # uname 00:05:52.903 00:13:51 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:52.903 00:13:51 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 312385 00:05:52.903 00:13:51 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:52.903 00:13:51 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:52.903 00:13:51 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 312385' 00:05:52.903 killing process with pid 312385 00:05:52.903 00:13:51 -- common/autotest_common.sh@945 -- # kill 312385 00:05:52.903 00:13:51 -- 
common/autotest_common.sh@950 -- # wait 312385 00:05:53.162 00:05:53.162 real 0m1.864s 00:05:53.162 user 0m5.249s 00:05:53.162 sys 0m0.425s 00:05:53.162 00:13:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:53.162 00:13:52 -- common/autotest_common.sh@10 -- # set +x 00:05:53.162 ************************************ 00:05:53.162 END TEST locking_overlapped_coremask 00:05:53.162 ************************************ 00:05:53.162 00:13:52 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:53.162 00:13:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:53.162 00:13:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:53.163 00:13:52 -- common/autotest_common.sh@10 -- # set +x 00:05:53.163 ************************************ 00:05:53.163 START TEST locking_overlapped_coremask_via_rpc 00:05:53.163 ************************************ 00:05:53.163 00:13:52 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:05:53.163 00:13:52 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=312774 00:05:53.163 00:13:52 -- event/cpu_locks.sh@149 -- # waitforlisten 312774 /var/tmp/spdk.sock 00:05:53.163 00:13:52 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:53.163 00:13:52 -- common/autotest_common.sh@819 -- # '[' -z 312774 ']' 00:05:53.163 00:13:52 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:53.163 00:13:52 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:53.163 00:13:52 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:53.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:53.163 00:13:52 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:53.163 00:13:52 -- common/autotest_common.sh@10 -- # set +x 00:05:53.163 [2024-07-15 00:13:52.155798] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:53.163 [2024-07-15 00:13:52.155886] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312774 ] 00:05:53.163 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.421 [2024-07-15 00:13:52.225346] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:53.421 [2024-07-15 00:13:52.225377] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:53.421 [2024-07-15 00:13:52.297013] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:53.421 [2024-07-15 00:13:52.297157] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:53.421 [2024-07-15 00:13:52.297272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.421 [2024-07-15 00:13:52.297272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:53.991 00:13:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:53.991 00:13:52 -- common/autotest_common.sh@852 -- # return 0 00:05:53.991 00:13:52 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:53.991 00:13:52 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=312954 00:05:53.991 00:13:52 -- event/cpu_locks.sh@153 -- # waitforlisten 312954 /var/tmp/spdk2.sock 00:05:53.991 00:13:52 -- common/autotest_common.sh@819 -- # '[' -z 312954 ']' 00:05:53.991 00:13:52 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:53.991 00:13:52 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:53.991 00:13:52 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:53.991 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:53.991 00:13:52 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:53.991 00:13:52 -- common/autotest_common.sh@10 -- # set +x 00:05:53.991 [2024-07-15 00:13:52.984909] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:53.991 [2024-07-15 00:13:52.984994] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312954 ] 00:05:53.991 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.250 [2024-07-15 00:13:53.079485] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:54.250 [2024-07-15 00:13:53.079515] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:54.250 [2024-07-15 00:13:53.221802] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:54.250 [2024-07-15 00:13:53.221965] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:54.250 [2024-07-15 00:13:53.222098] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:54.250 [2024-07-15 00:13:53.222100] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:54.814 00:13:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:54.814 00:13:53 -- common/autotest_common.sh@852 -- # return 0 00:05:54.814 00:13:53 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:54.814 00:13:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.814 00:13:53 -- common/autotest_common.sh@10 -- # set +x 00:05:54.814 00:13:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.814 00:13:53 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:54.814 00:13:53 -- common/autotest_common.sh@640 -- # local es=0 00:05:54.814 00:13:53 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:54.814 00:13:53 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:05:54.814 00:13:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:54.814 00:13:53 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:05:54.814 00:13:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:54.814 00:13:53 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:54.814 00:13:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.814 00:13:53 -- common/autotest_common.sh@10 -- # set +x 00:05:54.814 [2024-07-15 00:13:53.825502] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 312774 has claimed it. 00:05:54.814 request: 00:05:54.814 { 00:05:54.814 "method": "framework_enable_cpumask_locks", 00:05:54.814 "req_id": 1 00:05:54.814 } 00:05:54.814 Got JSON-RPC error response 00:05:54.814 response: 00:05:54.814 { 00:05:54.814 "code": -32603, 00:05:54.814 "message": "Failed to claim CPU core: 2" 00:05:54.814 } 00:05:54.814 00:13:53 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:05:54.814 00:13:53 -- common/autotest_common.sh@643 -- # es=1 00:05:54.814 00:13:53 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:54.814 00:13:53 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:54.814 00:13:53 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:54.814 00:13:53 -- event/cpu_locks.sh@158 -- # waitforlisten 312774 /var/tmp/spdk.sock 00:05:54.814 00:13:53 -- common/autotest_common.sh@819 -- # '[' -z 312774 ']' 00:05:54.814 00:13:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:54.814 00:13:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:54.814 00:13:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:54.814 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:54.815 00:13:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:54.815 00:13:53 -- common/autotest_common.sh@10 -- # set +x 00:05:55.073 00:13:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:55.073 00:13:54 -- common/autotest_common.sh@852 -- # return 0 00:05:55.073 00:13:54 -- event/cpu_locks.sh@159 -- # waitforlisten 312954 /var/tmp/spdk2.sock 00:05:55.073 00:13:54 -- common/autotest_common.sh@819 -- # '[' -z 312954 ']' 00:05:55.073 00:13:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:55.073 00:13:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:55.073 00:13:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:55.073 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:55.073 00:13:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:55.073 00:13:54 -- common/autotest_common.sh@10 -- # set +x 00:05:55.331 00:13:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:55.331 00:13:54 -- common/autotest_common.sh@852 -- # return 0 00:05:55.331 00:13:54 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:55.331 00:13:54 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:55.331 00:13:54 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:55.331 00:13:54 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:55.331 00:05:55.331 real 0m2.081s 00:05:55.331 user 0m0.816s 00:05:55.331 sys 0m0.196s 00:05:55.332 00:13:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:55.332 00:13:54 -- common/autotest_common.sh@10 -- # set +x 00:05:55.332 ************************************ 00:05:55.332 END TEST locking_overlapped_coremask_via_rpc 00:05:55.332 ************************************ 00:05:55.332 00:13:54 -- event/cpu_locks.sh@174 -- # cleanup 00:05:55.332 00:13:54 -- event/cpu_locks.sh@15 -- # [[ -z 312774 ]] 00:05:55.332 00:13:54 -- event/cpu_locks.sh@15 -- # killprocess 312774 00:05:55.332 00:13:54 -- common/autotest_common.sh@926 -- # '[' -z 312774 ']' 00:05:55.332 00:13:54 -- common/autotest_common.sh@930 -- # kill -0 312774 00:05:55.332 00:13:54 -- common/autotest_common.sh@931 -- # uname 00:05:55.332 00:13:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:55.332 00:13:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 312774 00:05:55.332 00:13:54 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:55.332 00:13:54 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:55.332 00:13:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 312774' 00:05:55.332 killing process with pid 312774 00:05:55.332 00:13:54 -- common/autotest_common.sh@945 -- # kill 312774 00:05:55.332 00:13:54 -- common/autotest_common.sh@950 -- # wait 312774 00:05:55.590 00:13:54 -- event/cpu_locks.sh@16 -- # [[ -z 312954 ]] 00:05:55.590 00:13:54 -- event/cpu_locks.sh@16 -- # killprocess 312954 00:05:55.590 00:13:54 -- common/autotest_common.sh@926 -- # '[' -z 312954 ']' 00:05:55.590 00:13:54 -- common/autotest_common.sh@930 -- # kill -0 312954 00:05:55.590 00:13:54 -- common/autotest_common.sh@931 -- # uname 00:05:55.590 
00:13:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:55.590 00:13:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 312954 00:05:55.856 00:13:54 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:55.856 00:13:54 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:55.856 00:13:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 312954' 00:05:55.856 killing process with pid 312954 00:05:55.856 00:13:54 -- common/autotest_common.sh@945 -- # kill 312954 00:05:55.856 00:13:54 -- common/autotest_common.sh@950 -- # wait 312954 00:05:56.234 00:13:54 -- event/cpu_locks.sh@18 -- # rm -f 00:05:56.234 00:13:54 -- event/cpu_locks.sh@1 -- # cleanup 00:05:56.234 00:13:54 -- event/cpu_locks.sh@15 -- # [[ -z 312774 ]] 00:05:56.234 00:13:54 -- event/cpu_locks.sh@15 -- # killprocess 312774 00:05:56.234 00:13:54 -- common/autotest_common.sh@926 -- # '[' -z 312774 ']' 00:05:56.234 00:13:54 -- common/autotest_common.sh@930 -- # kill -0 312774 00:05:56.234 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (312774) - No such process 00:05:56.234 00:13:54 -- common/autotest_common.sh@953 -- # echo 'Process with pid 312774 is not found' 00:05:56.234 Process with pid 312774 is not found 00:05:56.234 00:13:54 -- event/cpu_locks.sh@16 -- # [[ -z 312954 ]] 00:05:56.234 00:13:54 -- event/cpu_locks.sh@16 -- # killprocess 312954 00:05:56.234 00:13:54 -- common/autotest_common.sh@926 -- # '[' -z 312954 ']' 00:05:56.234 00:13:54 -- common/autotest_common.sh@930 -- # kill -0 312954 00:05:56.234 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (312954) - No such process 00:05:56.234 00:13:54 -- common/autotest_common.sh@953 -- # echo 'Process with pid 312954 is not found' 00:05:56.234 Process with pid 312954 is not found 00:05:56.234 00:13:54 -- event/cpu_locks.sh@18 -- # rm -f 00:05:56.234 00:05:56.234 real 0m18.254s 00:05:56.234 user 0m30.477s 00:05:56.234 sys 0m5.820s 00:05:56.234 00:13:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.234 00:13:54 -- common/autotest_common.sh@10 -- # set +x 00:05:56.234 ************************************ 00:05:56.234 END TEST cpu_locks 00:05:56.234 ************************************ 00:05:56.234 00:05:56.234 real 0m43.356s 00:05:56.234 user 1m21.169s 00:05:56.234 sys 0m9.845s 00:05:56.234 00:13:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.234 00:13:55 -- common/autotest_common.sh@10 -- # set +x 00:05:56.234 ************************************ 00:05:56.234 END TEST event 00:05:56.234 ************************************ 00:05:56.234 00:13:55 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:56.234 00:13:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:56.235 00:13:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:56.235 00:13:55 -- common/autotest_common.sh@10 -- # set +x 00:05:56.235 ************************************ 00:05:56.235 START TEST thread 00:05:56.235 ************************************ 00:05:56.235 00:13:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:56.235 * Looking for test storage... 
00:05:56.235 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:05:56.235 00:13:55 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:56.235 00:13:55 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:05:56.235 00:13:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:56.235 00:13:55 -- common/autotest_common.sh@10 -- # set +x 00:05:56.235 ************************************ 00:05:56.235 START TEST thread_poller_perf 00:05:56.235 ************************************ 00:05:56.235 00:13:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:56.235 [2024-07-15 00:13:55.186731] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:56.235 [2024-07-15 00:13:55.186821] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313391 ] 00:05:56.235 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.235 [2024-07-15 00:13:55.259236] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.493 [2024-07-15 00:13:55.329505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.493 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:05:57.430 ====================================== 00:05:57.430 busy:2505062540 (cyc) 00:05:57.430 total_run_count: 824000 00:05:57.430 tsc_hz: 2500000000 (cyc) 00:05:57.430 ====================================== 00:05:57.430 poller_cost: 3040 (cyc), 1216 (nsec) 00:05:57.430 00:05:57.430 real 0m1.225s 00:05:57.430 user 0m1.129s 00:05:57.430 sys 0m0.091s 00:05:57.430 00:13:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:57.430 00:13:56 -- common/autotest_common.sh@10 -- # set +x 00:05:57.430 ************************************ 00:05:57.430 END TEST thread_poller_perf 00:05:57.430 ************************************ 00:05:57.430 00:13:56 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:57.430 00:13:56 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:05:57.430 00:13:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:57.430 00:13:56 -- common/autotest_common.sh@10 -- # set +x 00:05:57.430 ************************************ 00:05:57.430 START TEST thread_poller_perf 00:05:57.430 ************************************ 00:05:57.430 00:13:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:57.430 [2024-07-15 00:13:56.453522] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:57.430 [2024-07-15 00:13:56.453618] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313622 ] 00:05:57.689 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.689 [2024-07-15 00:13:56.523852] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.689 [2024-07-15 00:13:56.590093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.689 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:05:58.627 ====================================== 00:05:58.627 busy:2501902472 (cyc) 00:05:58.627 total_run_count: 14432000 00:05:58.627 tsc_hz: 2500000000 (cyc) 00:05:58.627 ====================================== 00:05:58.627 poller_cost: 173 (cyc), 69 (nsec) 00:05:58.627 00:05:58.627 real 0m1.214s 00:05:58.627 user 0m1.125s 00:05:58.627 sys 0m0.085s 00:05:58.627 00:13:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:58.627 00:13:57 -- common/autotest_common.sh@10 -- # set +x 00:05:58.627 ************************************ 00:05:58.627 END TEST thread_poller_perf 00:05:58.627 ************************************ 00:05:58.886 00:13:57 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:05:58.886 00:13:57 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:58.886 00:13:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:58.886 00:13:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:58.886 00:13:57 -- common/autotest_common.sh@10 -- # set +x 00:05:58.886 ************************************ 00:05:58.886 START TEST thread_spdk_lock 00:05:58.886 ************************************ 00:05:58.886 00:13:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:58.886 [2024-07-15 00:13:57.708838] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:58.886 [2024-07-15 00:13:57.708944] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313904 ] 00:05:58.886 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.886 [2024-07-15 00:13:57.778901] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:58.886 [2024-07-15 00:13:57.845863] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:58.886 [2024-07-15 00:13:57.845865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.453 [2024-07-15 00:13:58.334666] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:59.454 [2024-07-15 00:13:58.334702] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:05:59.454 [2024-07-15 00:13:58.334716] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x149c080 00:05:59.454 [2024-07-15 00:13:58.335586] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:59.454 [2024-07-15 00:13:58.335691] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:59.454 [2024-07-15 00:13:58.335710] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:59.454 Starting test contend 00:05:59.454 Worker Delay Wait us Hold us Total us 00:05:59.454 0 3 176711 184480 361191 00:05:59.454 1 5 91168 284249 375418 00:05:59.454 PASS test contend 00:05:59.454 Starting test hold_by_poller 00:05:59.454 PASS test hold_by_poller 00:05:59.454 Starting test hold_by_message 00:05:59.454 PASS test hold_by_message 00:05:59.454 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:05:59.454 100014 assertions passed 00:05:59.454 0 assertions failed 00:05:59.454 00:05:59.454 real 0m0.703s 00:05:59.454 user 0m1.104s 00:05:59.454 sys 0m0.085s 00:05:59.454 00:13:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.454 00:13:58 -- common/autotest_common.sh@10 -- # set +x 00:05:59.454 ************************************ 00:05:59.454 END TEST thread_spdk_lock 00:05:59.454 ************************************ 00:05:59.454 00:05:59.454 real 0m3.361s 00:05:59.454 user 0m3.438s 00:05:59.454 sys 0m0.431s 00:05:59.454 00:13:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.454 00:13:58 -- common/autotest_common.sh@10 -- # set +x 00:05:59.454 ************************************ 00:05:59.454 END TEST thread 00:05:59.454 ************************************ 00:05:59.454 00:13:58 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:59.454 00:13:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
00:05:59.454 00:13:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:59.454 00:13:58 -- common/autotest_common.sh@10 -- # set +x 00:05:59.454 ************************************ 00:05:59.454 START TEST accel 00:05:59.454 ************************************ 00:05:59.454 00:13:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:59.713 * Looking for test storage... 00:05:59.713 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:05:59.713 00:13:58 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:05:59.713 00:13:58 -- accel/accel.sh@74 -- # get_expected_opcs 00:05:59.713 00:13:58 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:59.713 00:13:58 -- accel/accel.sh@59 -- # spdk_tgt_pid=314168 00:05:59.713 00:13:58 -- accel/accel.sh@60 -- # waitforlisten 314168 00:05:59.713 00:13:58 -- common/autotest_common.sh@819 -- # '[' -z 314168 ']' 00:05:59.713 00:13:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:59.713 00:13:58 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:59.713 00:13:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:59.713 00:13:58 -- accel/accel.sh@58 -- # build_accel_config 00:05:59.713 00:13:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:59.713 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:59.713 00:13:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:59.713 00:13:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:59.713 00:13:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:59.713 00:13:58 -- common/autotest_common.sh@10 -- # set +x 00:05:59.713 00:13:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:59.713 00:13:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:59.713 00:13:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:59.713 00:13:58 -- accel/accel.sh@41 -- # local IFS=, 00:05:59.713 00:13:58 -- accel/accel.sh@42 -- # jq -r . 00:05:59.713 [2024-07-15 00:13:58.602342] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:59.713 [2024-07-15 00:13:58.602438] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314168 ] 00:05:59.713 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.713 [2024-07-15 00:13:58.671860] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.713 [2024-07-15 00:13:58.746478] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:59.714 [2024-07-15 00:13:58.746602] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.651 00:13:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:00.651 00:13:59 -- common/autotest_common.sh@852 -- # return 0 00:06:00.651 00:13:59 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:00.651 00:13:59 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:00.651 00:13:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:00.651 00:13:59 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:00.651 00:13:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.651 00:13:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 
00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # IFS== 00:06:00.651 00:13:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:00.651 00:13:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:00.651 00:13:59 -- accel/accel.sh@67 -- # killprocess 314168 00:06:00.651 00:13:59 -- common/autotest_common.sh@926 -- # '[' -z 314168 ']' 00:06:00.651 00:13:59 -- common/autotest_common.sh@930 -- # kill -0 314168 00:06:00.651 00:13:59 -- common/autotest_common.sh@931 -- # uname 00:06:00.651 00:13:59 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:00.651 00:13:59 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 314168 00:06:00.651 00:13:59 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:00.651 00:13:59 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:00.651 00:13:59 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 314168' 00:06:00.651 killing process with pid 314168 00:06:00.651 00:13:59 -- common/autotest_common.sh@945 -- # kill 314168 00:06:00.651 00:13:59 -- common/autotest_common.sh@950 -- # wait 314168 00:06:00.912 00:13:59 -- accel/accel.sh@68 -- # trap - ERR 00:06:00.912 00:13:59 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:00.912 00:13:59 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:00.912 00:13:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:00.912 00:13:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.912 00:13:59 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:00.912 00:13:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:00.912 00:13:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:00.912 00:13:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:00.912 00:13:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:00.912 00:13:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:00.912 00:13:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:00.912 00:13:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:00.912 00:13:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:00.912 00:13:59 -- accel/accel.sh@42 -- # jq -r . 
00:06:00.912 00:13:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:00.912 00:13:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.912 00:13:59 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:00.912 00:13:59 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:00.912 00:13:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:00.912 00:13:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.912 ************************************ 00:06:00.912 START TEST accel_missing_filename 00:06:00.912 ************************************ 00:06:00.912 00:13:59 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:00.912 00:13:59 -- common/autotest_common.sh@640 -- # local es=0 00:06:00.912 00:13:59 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:00.912 00:13:59 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:00.912 00:13:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:00.912 00:13:59 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:00.912 00:13:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:00.912 00:13:59 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:00.912 00:13:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:00.912 00:13:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:00.912 00:13:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:00.912 00:13:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:00.912 00:13:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:00.912 00:13:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:00.912 00:13:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:00.912 00:13:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:00.912 00:13:59 -- accel/accel.sh@42 -- # jq -r . 00:06:00.912 [2024-07-15 00:13:59.902106] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:00.912 [2024-07-15 00:13:59.902213] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314363 ] 00:06:00.912 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.171 [2024-07-15 00:13:59.975967] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.171 [2024-07-15 00:14:00.054556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.171 [2024-07-15 00:14:00.095303] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:01.171 [2024-07-15 00:14:00.156090] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:01.171 A filename is required. 
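The es= arithmetic that follows is the harness's NOT wrapper in autotest_common.sh unwinding accel_perf's expected failure: the raw exit code 234 is folded down past 128 and collapsed to 1, and the final (( !es == 0 )) turns "the command failed" into a passing exit status. A minimal sketch of that pattern, reconstructed from the xtrace line numbers rather than copied from the SPDK source:

    NOT() {
        # Run a command that is expected to fail; succeed only if it did fail.
        local es=0
        "$@" || es=$?                        # es=234 below: accel_perf aborted
        (( es > 128 )) && es=$(( es - 128 )) # fold the offset: 234 -> 106
        (( es != 0 )) && es=1                # collapse any failure code to 1
        (( !es == 0 ))                       # exit 0 (pass) only when es != 0
    }

The valid_exec_arg step traced at sh@642 merely confirms the target is runnable before invoking it.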
00:06:01.171 00:14:00 -- common/autotest_common.sh@643 -- # es=234 00:06:01.171 00:14:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:01.171 00:14:00 -- common/autotest_common.sh@652 -- # es=106 00:06:01.171 00:14:00 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:01.171 00:14:00 -- common/autotest_common.sh@660 -- # es=1 00:06:01.171 00:14:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:01.171 00:06:01.171 real 0m0.344s 00:06:01.171 user 0m0.240s 00:06:01.171 sys 0m0.138s 00:06:01.171 00:14:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.171 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.171 ************************************ 00:06:01.171 END TEST accel_missing_filename 00:06:01.171 ************************************ 00:06:01.430 00:14:00 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:01.430 00:14:00 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:01.430 00:14:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.430 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.430 ************************************ 00:06:01.430 START TEST accel_compress_verify 00:06:01.430 ************************************ 00:06:01.430 00:14:00 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:01.430 00:14:00 -- common/autotest_common.sh@640 -- # local es=0 00:06:01.430 00:14:00 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:01.430 00:14:00 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:01.430 00:14:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:01.430 00:14:00 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:01.430 00:14:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:01.430 00:14:00 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:01.430 00:14:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:01.430 00:14:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.430 00:14:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.430 00:14:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.430 00:14:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.430 00:14:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.430 00:14:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.430 00:14:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.430 00:14:00 -- accel/accel.sh@42 -- # jq -r . 00:06:01.430 [2024-07-15 00:14:00.294518] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:01.430 [2024-07-15 00:14:00.294611] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314555 ] 00:06:01.430 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.430 [2024-07-15 00:14:00.368223] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.430 [2024-07-15 00:14:00.433514] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.430 [2024-07-15 00:14:00.473661] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:01.690 [2024-07-15 00:14:00.533222] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:01.690 00:06:01.690 Compression does not support the verify option, aborting. 00:06:01.690 00:14:00 -- common/autotest_common.sh@643 -- # es=161 00:06:01.690 00:14:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:01.690 00:14:00 -- common/autotest_common.sh@652 -- # es=33 00:06:01.690 00:14:00 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:01.690 00:14:00 -- common/autotest_common.sh@660 -- # es=1 00:06:01.690 00:14:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:01.690 00:06:01.690 real 0m0.329s 00:06:01.690 user 0m0.237s 00:06:01.690 sys 0m0.132s 00:06:01.690 00:14:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.690 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.690 ************************************ 00:06:01.690 END TEST accel_compress_verify 00:06:01.690 ************************************ 00:06:01.690 00:14:00 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:01.690 00:14:00 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:01.690 00:14:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.690 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.690 ************************************ 00:06:01.690 START TEST accel_wrong_workload 00:06:01.690 ************************************ 00:06:01.690 00:14:00 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:01.690 00:14:00 -- common/autotest_common.sh@640 -- # local es=0 00:06:01.690 00:14:00 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:01.690 00:14:00 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:01.690 00:14:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:01.690 00:14:00 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:01.690 00:14:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:01.690 00:14:00 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:01.690 00:14:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:01.690 00:14:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.690 00:14:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.690 00:14:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.690 00:14:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.690 00:14:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.690 00:14:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.690 00:14:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.690 00:14:00 -- accel/accel.sh@42 -- # jq -r . 
00:06:01.690 Unsupported workload type: foobar 00:06:01.690 [2024-07-15 00:14:00.668745] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:01.690 accel_perf options: 00:06:01.690 [-h help message] 00:06:01.690 [-q queue depth per core] 00:06:01.690 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:01.690 [-T number of threads per core 00:06:01.690 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:01.690 [-t time in seconds] 00:06:01.690 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:01.690 [ dif_verify, , dif_generate, dif_generate_copy 00:06:01.690 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:01.690 [-l for compress/decompress workloads, name of uncompressed input file 00:06:01.690 [-S for crc32c workload, use this seed value (default 0) 00:06:01.690 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:01.690 [-f for fill workload, use this BYTE value (default 255) 00:06:01.690 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:01.690 [-y verify result if this switch is on] 00:06:01.690 [-a tasks to allocate per core (default: same value as -q)] 00:06:01.690 Can be used to spread operations across a wider range of memory. 00:06:01.690 00:14:00 -- common/autotest_common.sh@643 -- # es=1 00:06:01.690 00:14:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:01.690 00:14:00 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:01.690 00:14:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:01.690 00:06:01.690 real 0m0.028s 00:06:01.690 user 0m0.011s 00:06:01.690 sys 0m0.017s 00:06:01.690 00:14:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.690 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.690 ************************************ 00:06:01.690 END TEST accel_wrong_workload 00:06:01.690 ************************************ 00:06:01.690 Error: writing output failed: Broken pipe 00:06:01.690 00:14:00 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:01.690 00:14:00 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:01.690 00:14:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.690 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.690 ************************************ 00:06:01.690 START TEST accel_negative_buffers 00:06:01.690 ************************************ 00:06:01.690 00:14:00 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:01.690 00:14:00 -- common/autotest_common.sh@640 -- # local es=0 00:06:01.690 00:14:00 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:01.691 00:14:00 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:01.691 00:14:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:01.691 00:14:00 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:01.691 00:14:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:01.691 00:14:00 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:01.691 00:14:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:01.691 00:14:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.691 00:14:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.691 00:14:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.691 00:14:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.691 00:14:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.691 00:14:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.691 00:14:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.691 00:14:00 -- accel/accel.sh@42 -- # jq -r . 00:06:01.691 -x option must be non-negative. 00:06:01.691 [2024-07-15 00:14:00.744588] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:01.950 accel_perf options: 00:06:01.950 [-h help message] 00:06:01.950 [-q queue depth per core] 00:06:01.950 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:01.950 [-T number of threads per core 00:06:01.950 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:01.950 [-t time in seconds] 00:06:01.950 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:01.950 [ dif_verify, , dif_generate, dif_generate_copy 00:06:01.950 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:01.950 [-l for compress/decompress workloads, name of uncompressed input file 00:06:01.950 [-S for crc32c workload, use this seed value (default 0) 00:06:01.950 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:01.950 [-f for fill workload, use this BYTE value (default 255) 00:06:01.950 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:01.950 [-y verify result if this switch is on] 00:06:01.950 [-a tasks to allocate per core (default: same value as -q)] 00:06:01.950 Can be used to spread operations across a wider range of memory. 
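The asterisk banners and real/user/sys summaries that bracket every one of these tests come from the harness's run_test wrapper. A rough sketch of it, inferred from the recurring xtrace lines ('[' N -le 1 ']', xtrace_disable, the banner echoes) rather than quoted from autotest_common.sh:

    run_test() {
        # Guard traced as '[' N -le 1 ']': a test name plus a command is required.
        [ "$#" -le 1 ] && return 1
        local test_name=$1
        shift
        echo '************************************'
        echo "START TEST $test_name"
        echo '************************************'
        time "$@"        # emits the real/user/sys lines seen after each test
        echo '************************************'
        echo "END TEST $test_name"
        echo '************************************'
    }

Here run_test accel_negative_buffers wraps a NOT accel_perf invocation, so accel_perf's '-x option must be non-negative' rejection above is exactly what lets the END TEST banner below report a pass.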
00:06:01.950 00:14:00 -- common/autotest_common.sh@643 -- # es=1 00:06:01.950 00:14:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:01.950 00:14:00 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:01.950 00:14:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:01.950 00:06:01.950 real 0m0.029s 00:06:01.950 user 0m0.012s 00:06:01.950 sys 0m0.017s 00:06:01.950 00:14:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.950 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.950 ************************************ 00:06:01.950 END TEST accel_negative_buffers 00:06:01.950 ************************************ 00:06:01.950 Error: writing output failed: Broken pipe 00:06:01.950 00:14:00 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:01.950 00:14:00 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:01.950 00:14:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.950 00:14:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.950 ************************************ 00:06:01.950 START TEST accel_crc32c 00:06:01.950 ************************************ 00:06:01.950 00:14:00 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:01.950 00:14:00 -- accel/accel.sh@16 -- # local accel_opc 00:06:01.950 00:14:00 -- accel/accel.sh@17 -- # local accel_module 00:06:01.950 00:14:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:01.950 00:14:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:01.950 00:14:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.950 00:14:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.950 00:14:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.950 00:14:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.950 00:14:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.950 00:14:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.950 00:14:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.950 00:14:00 -- accel/accel.sh@42 -- # jq -r . 00:06:01.950 [2024-07-15 00:14:00.822190] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:01.950 [2024-07-15 00:14:00.822279] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314608 ] 00:06:01.950 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.950 [2024-07-15 00:14:00.892985] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.950 [2024-07-15 00:14:00.964960] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.328 00:14:02 -- accel/accel.sh@18 -- # out=' 00:06:03.328 SPDK Configuration: 00:06:03.328 Core mask: 0x1 00:06:03.328 00:06:03.328 Accel Perf Configuration: 00:06:03.328 Workload Type: crc32c 00:06:03.328 CRC-32C seed: 32 00:06:03.328 Transfer size: 4096 bytes 00:06:03.328 Vector count 1 00:06:03.328 Module: software 00:06:03.328 Queue depth: 32 00:06:03.328 Allocate depth: 32 00:06:03.328 # threads/core: 1 00:06:03.328 Run time: 1 seconds 00:06:03.328 Verify: Yes 00:06:03.328 00:06:03.328 Running for 1 seconds... 
00:06:03.328 00:06:03.329 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:03.329 ------------------------------------------------------------------------------------ 00:06:03.329 0,0 844384/s 3298 MiB/s 0 0 00:06:03.329 ==================================================================================== 00:06:03.329 Total 844384/s 3298 MiB/s 0 0' 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:03.329 00:14:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:03.329 00:14:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:03.329 00:14:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:03.329 00:14:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.329 00:14:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.329 00:14:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:03.329 00:14:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:03.329 00:14:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:03.329 00:14:02 -- accel/accel.sh@42 -- # jq -r . 00:06:03.329 [2024-07-15 00:14:02.151028] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:03.329 [2024-07-15 00:14:02.151117] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314874 ] 00:06:03.329 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.329 [2024-07-15 00:14:02.219523] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.329 [2024-07-15 00:14:02.287207] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=0x1 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=crc32c 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=32 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 
00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=software 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@23 -- # accel_module=software 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=32 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=32 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=1 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val=Yes 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:03.329 00:14:02 -- accel/accel.sh@21 -- # val= 00:06:03.329 00:14:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # IFS=: 00:06:03.329 00:14:02 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@21 -- # val= 00:06:04.709 00:14:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # IFS=: 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@21 -- # val= 00:06:04.709 00:14:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # IFS=: 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@21 -- # val= 00:06:04.709 00:14:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # IFS=: 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@21 -- # val= 00:06:04.709 00:14:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # IFS=: 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@21 -- # val= 00:06:04.709 00:14:03 -- accel/accel.sh@22 -- # case "$var" in 
00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # IFS=: 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@21 -- # val= 00:06:04.709 00:14:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # IFS=: 00:06:04.709 00:14:03 -- accel/accel.sh@20 -- # read -r var val 00:06:04.709 00:14:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:04.709 00:14:03 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:04.709 00:14:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:04.709 00:06:04.709 real 0m2.659s 00:06:04.709 user 0m2.403s 00:06:04.709 sys 0m0.265s 00:06:04.709 00:14:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.709 00:14:03 -- common/autotest_common.sh@10 -- # set +x 00:06:04.709 ************************************ 00:06:04.709 END TEST accel_crc32c 00:06:04.709 ************************************ 00:06:04.709 00:14:03 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:04.709 00:14:03 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:04.709 00:14:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:04.709 00:14:03 -- common/autotest_common.sh@10 -- # set +x 00:06:04.709 ************************************ 00:06:04.709 START TEST accel_crc32c_C2 00:06:04.709 ************************************ 00:06:04.709 00:14:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:04.709 00:14:03 -- accel/accel.sh@16 -- # local accel_opc 00:06:04.709 00:14:03 -- accel/accel.sh@17 -- # local accel_module 00:06:04.709 00:14:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:04.709 00:14:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:04.709 00:14:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:04.709 00:14:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:04.709 00:14:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:04.709 00:14:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:04.709 00:14:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:04.709 00:14:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:04.709 00:14:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:04.709 00:14:03 -- accel/accel.sh@42 -- # jq -r . 00:06:04.709 [2024-07-15 00:14:03.519276] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:04.709 [2024-07-15 00:14:03.519332] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid315167 ] 00:06:04.709 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.709 [2024-07-15 00:14:03.579759] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.709 [2024-07-15 00:14:03.647925] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.091 00:14:04 -- accel/accel.sh@18 -- # out=' 00:06:06.091 SPDK Configuration: 00:06:06.091 Core mask: 0x1 00:06:06.091 00:06:06.091 Accel Perf Configuration: 00:06:06.091 Workload Type: crc32c 00:06:06.091 CRC-32C seed: 0 00:06:06.091 Transfer size: 4096 bytes 00:06:06.091 Vector count 2 00:06:06.091 Module: software 00:06:06.091 Queue depth: 32 00:06:06.091 Allocate depth: 32 00:06:06.091 # threads/core: 1 00:06:06.091 Run time: 1 seconds 00:06:06.091 Verify: Yes 00:06:06.091 00:06:06.091 Running for 1 seconds... 00:06:06.091 00:06:06.091 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:06.091 ------------------------------------------------------------------------------------ 00:06:06.091 0,0 622176/s 4860 MiB/s 0 0 00:06:06.091 ==================================================================================== 00:06:06.091 Total 622176/s 2430 MiB/s 0 0' 00:06:06.091 00:14:04 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:04 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:06.091 00:14:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:06.091 00:14:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:06.091 00:14:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:06.091 00:14:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.091 00:14:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.091 00:14:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:06.091 00:14:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:06.091 00:14:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:06.091 00:14:04 -- accel/accel.sh@42 -- # jq -r . 00:06:06.091 [2024-07-15 00:14:04.837206] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:06.091 [2024-07-15 00:14:04.837302] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid315429 ] 00:06:06.091 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.091 [2024-07-15 00:14:04.907603] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.091 [2024-07-15 00:14:04.973734] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=0x1 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=crc32c 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=0 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=software 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@23 -- # accel_module=software 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=32 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=32 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- 
accel/accel.sh@21 -- # val=1 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val=Yes 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:06.091 00:14:05 -- accel/accel.sh@21 -- # val= 00:06:06.091 00:14:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # IFS=: 00:06:06.091 00:14:05 -- accel/accel.sh@20 -- # read -r var val 00:06:07.477 00:14:06 -- accel/accel.sh@21 -- # val= 00:06:07.477 00:14:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # IFS=: 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # read -r var val 00:06:07.477 00:14:06 -- accel/accel.sh@21 -- # val= 00:06:07.477 00:14:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # IFS=: 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # read -r var val 00:06:07.477 00:14:06 -- accel/accel.sh@21 -- # val= 00:06:07.477 00:14:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # IFS=: 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # read -r var val 00:06:07.477 00:14:06 -- accel/accel.sh@21 -- # val= 00:06:07.477 00:14:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # IFS=: 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # read -r var val 00:06:07.477 00:14:06 -- accel/accel.sh@21 -- # val= 00:06:07.477 00:14:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # IFS=: 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # read -r var val 00:06:07.477 00:14:06 -- accel/accel.sh@21 -- # val= 00:06:07.477 00:14:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # IFS=: 00:06:07.477 00:14:06 -- accel/accel.sh@20 -- # read -r var val 00:06:07.478 00:14:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:07.478 00:14:06 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:07.478 00:14:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:07.478 00:06:07.478 real 0m2.638s 00:06:07.478 user 0m2.399s 00:06:07.478 sys 0m0.248s 00:06:07.478 00:14:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.478 00:14:06 -- common/autotest_common.sh@10 -- # set +x 00:06:07.478 ************************************ 00:06:07.478 END TEST accel_crc32c_C2 00:06:07.478 ************************************ 00:06:07.478 00:14:06 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:07.478 00:14:06 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:07.478 00:14:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:07.478 00:14:06 -- common/autotest_common.sh@10 -- # set +x 00:06:07.478 ************************************ 00:06:07.478 START TEST accel_copy 
00:06:07.478 ************************************ 00:06:07.478 00:14:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:07.478 00:14:06 -- accel/accel.sh@16 -- # local accel_opc 00:06:07.478 00:14:06 -- accel/accel.sh@17 -- # local accel_module 00:06:07.478 00:14:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:07.478 00:14:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:07.478 00:14:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:07.478 00:14:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:07.478 00:14:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:07.478 00:14:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:07.478 00:14:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:07.478 00:14:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:07.478 00:14:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:07.478 00:14:06 -- accel/accel.sh@42 -- # jq -r . 00:06:07.478 [2024-07-15 00:14:06.215298] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:07.478 [2024-07-15 00:14:06.215391] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid315624 ] 00:06:07.478 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.478 [2024-07-15 00:14:06.284993] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.478 [2024-07-15 00:14:06.353250] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.856 00:14:07 -- accel/accel.sh@18 -- # out=' 00:06:08.856 SPDK Configuration: 00:06:08.856 Core mask: 0x1 00:06:08.856 00:06:08.856 Accel Perf Configuration: 00:06:08.856 Workload Type: copy 00:06:08.856 Transfer size: 4096 bytes 00:06:08.856 Vector count 1 00:06:08.856 Module: software 00:06:08.856 Queue depth: 32 00:06:08.856 Allocate depth: 32 00:06:08.856 # threads/core: 1 00:06:08.856 Run time: 1 seconds 00:06:08.856 Verify: Yes 00:06:08.856 00:06:08.856 Running for 1 seconds... 00:06:08.856 00:06:08.856 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:08.856 ------------------------------------------------------------------------------------ 00:06:08.856 0,0 558208/s 2180 MiB/s 0 0 00:06:08.856 ==================================================================================== 00:06:08.856 Total 558208/s 2180 MiB/s 0 0' 00:06:08.856 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.856 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.856 00:14:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:08.856 00:14:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:08.856 00:14:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.857 00:14:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.857 00:14:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.857 00:14:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.857 00:14:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.857 00:14:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.857 00:14:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.857 00:14:07 -- accel/accel.sh@42 -- # jq -r . 00:06:08.857 [2024-07-15 00:14:07.540759] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
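The '-c /dev/fd/62' in the accel_perf invocations above is a JSON config handed to the child process on an inherited file descriptor (the harness pipes the output of build_accel_config through jq -r . into it). A rough sketch of the same trick, with a placeholder empty JSON document rather than the harness-generated one:

    # Illustrative only: pass a JSON config to accel_perf on fd 62.
    # '{}' is a stand-in; the real run substitutes the jq-filtered
    # accel_json_cfg that build_accel_config assembles.
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    exec 62< <(printf '%s\n' '{}')                 # open fd 62 on the JSON text
    "$SPDK_DIR/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w copy -y
    exec 62<&-                                     # close the descriptor again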
00:06:08.857 [2024-07-15 00:14:07.540850] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid315793 ] 00:06:08.857 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.857 [2024-07-15 00:14:07.609789] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.857 [2024-07-15 00:14:07.676339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=0x1 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=copy 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=software 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@23 -- # accel_module=software 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=32 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=32 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=1 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val=Yes 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:08.857 00:14:07 -- accel/accel.sh@21 -- # val= 00:06:08.857 00:14:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # IFS=: 00:06:08.857 00:14:07 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@21 -- # val= 00:06:09.794 00:14:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # IFS=: 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@21 -- # val= 00:06:09.794 00:14:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # IFS=: 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@21 -- # val= 00:06:09.794 00:14:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # IFS=: 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@21 -- # val= 00:06:09.794 00:14:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # IFS=: 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@21 -- # val= 00:06:09.794 00:14:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # IFS=: 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@21 -- # val= 00:06:09.794 00:14:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # IFS=: 00:06:09.794 00:14:08 -- accel/accel.sh@20 -- # read -r var val 00:06:09.794 00:14:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:09.794 00:14:08 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:09.794 00:14:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:09.794 00:06:09.794 real 0m2.654s 00:06:09.794 user 0m2.404s 00:06:09.794 sys 0m0.258s 00:06:09.794 00:14:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:09.794 00:14:08 -- common/autotest_common.sh@10 -- # set +x 00:06:09.794 ************************************ 00:06:09.794 END TEST accel_copy 00:06:09.794 ************************************ 00:06:10.053 00:14:08 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:10.053 00:14:08 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:10.053 00:14:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:10.053 00:14:08 -- common/autotest_common.sh@10 -- # set +x 00:06:10.053 ************************************ 00:06:10.054 START TEST accel_fill 00:06:10.054 ************************************ 00:06:10.054 00:14:08 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:10.054 00:14:08 -- accel/accel.sh@16 -- # local accel_opc 
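The repeated 'IFS=:' / 'read -r var val' / 'case "$var" in' records throughout this trace come from a loop that replays the saved accel_perf report line by line and extracts fields such as the workload and module names. A self-contained sketch of that shape (not the verbatim accel.sh source):

    # Scan a colon-separated report and pull out selected fields,
    # the way the traced loop assigns accel_opc and accel_module.
    out=$'Workload Type: fill\nModule: software'
    while IFS=: read -r var val; do
        case "$var" in
            'Workload Type') accel_opc=$(echo $val) ;;     # unquoted echo trims blanks
            'Module')        accel_module=$(echo $val) ;;
        esac
    done <<< "$out"
    echo "$accel_opc via $accel_module"                    # -> fill via software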
00:06:10.054 00:14:08 -- accel/accel.sh@17 -- # local accel_module 00:06:10.054 00:14:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:10.054 00:14:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:10.054 00:14:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.054 00:14:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.054 00:14:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.054 00:14:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.054 00:14:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.054 00:14:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.054 00:14:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.054 00:14:08 -- accel/accel.sh@42 -- # jq -r . 00:06:10.054 [2024-07-15 00:14:08.916089] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:10.054 [2024-07-15 00:14:08.916180] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid316030 ] 00:06:10.054 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.054 [2024-07-15 00:14:08.985869] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.054 [2024-07-15 00:14:09.055262] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.434 00:14:10 -- accel/accel.sh@18 -- # out=' 00:06:11.434 SPDK Configuration: 00:06:11.434 Core mask: 0x1 00:06:11.434 00:06:11.434 Accel Perf Configuration: 00:06:11.434 Workload Type: fill 00:06:11.434 Fill pattern: 0x80 00:06:11.434 Transfer size: 4096 bytes 00:06:11.434 Vector count 1 00:06:11.434 Module: software 00:06:11.434 Queue depth: 64 00:06:11.434 Allocate depth: 64 00:06:11.434 # threads/core: 1 00:06:11.434 Run time: 1 seconds 00:06:11.434 Verify: Yes 00:06:11.434 00:06:11.434 Running for 1 seconds... 00:06:11.434 00:06:11.434 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:11.434 ------------------------------------------------------------------------------------ 00:06:11.434 0,0 977024/s 3816 MiB/s 0 0 00:06:11.434 ==================================================================================== 00:06:11.434 Total 977024/s 3816 MiB/s 0 0' 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:11.434 00:14:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:11.434 00:14:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.434 00:14:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.434 00:14:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.434 00:14:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.434 00:14:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.434 00:14:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.434 00:14:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.434 00:14:10 -- accel/accel.sh@42 -- # jq -r . 00:06:11.434 [2024-07-15 00:14:10.242081] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
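One detail linking the run_test arguments to the fill report above: the fill byte is given in decimal on the command line (-f 128) and echoed back in hex (Fill pattern: 0x80), while -q 64 and -a 64 become the queue and allocate depths of 64.

    # decimal 128 is the 0x80 fill byte the report prints
    printf 'Fill pattern: 0x%02x\n' 128    # -> Fill pattern: 0x80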
00:06:11.434 [2024-07-15 00:14:10.242174] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid316296 ] 00:06:11.434 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.434 [2024-07-15 00:14:10.310594] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.434 [2024-07-15 00:14:10.376279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=0x1 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=fill 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=0x80 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=software 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@23 -- # accel_module=software 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=64 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=64 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- 
accel/accel.sh@21 -- # val=1 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val=Yes 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:11.434 00:14:10 -- accel/accel.sh@21 -- # val= 00:06:11.434 00:14:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # IFS=: 00:06:11.434 00:14:10 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@21 -- # val= 00:06:12.816 00:14:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # IFS=: 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@21 -- # val= 00:06:12.816 00:14:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # IFS=: 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@21 -- # val= 00:06:12.816 00:14:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # IFS=: 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@21 -- # val= 00:06:12.816 00:14:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # IFS=: 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@21 -- # val= 00:06:12.816 00:14:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # IFS=: 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@21 -- # val= 00:06:12.816 00:14:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # IFS=: 00:06:12.816 00:14:11 -- accel/accel.sh@20 -- # read -r var val 00:06:12.816 00:14:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:12.816 00:14:11 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:12.816 00:14:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:12.816 00:06:12.816 real 0m2.653s 00:06:12.816 user 0m2.404s 00:06:12.816 sys 0m0.256s 00:06:12.816 00:14:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.816 00:14:11 -- common/autotest_common.sh@10 -- # set +x 00:06:12.816 ************************************ 00:06:12.816 END TEST accel_fill 00:06:12.816 ************************************ 00:06:12.816 00:14:11 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:12.816 00:14:11 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:12.816 00:14:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:12.816 00:14:11 -- common/autotest_common.sh@10 -- # set +x 00:06:12.816 ************************************ 00:06:12.816 START TEST 
accel_copy_crc32c 00:06:12.816 ************************************ 00:06:12.816 00:14:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:12.816 00:14:11 -- accel/accel.sh@16 -- # local accel_opc 00:06:12.816 00:14:11 -- accel/accel.sh@17 -- # local accel_module 00:06:12.816 00:14:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:12.816 00:14:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:12.816 00:14:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:12.816 00:14:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:12.816 00:14:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:12.816 00:14:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:12.816 00:14:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:12.816 00:14:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:12.816 00:14:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:12.816 00:14:11 -- accel/accel.sh@42 -- # jq -r . 00:06:12.816 [2024-07-15 00:14:11.618946] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:12.816 [2024-07-15 00:14:11.619033] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid316587 ] 00:06:12.816 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.816 [2024-07-15 00:14:11.688922] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.816 [2024-07-15 00:14:11.756345] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.196 00:14:12 -- accel/accel.sh@18 -- # out=' 00:06:14.196 SPDK Configuration: 00:06:14.196 Core mask: 0x1 00:06:14.196 00:06:14.196 Accel Perf Configuration: 00:06:14.196 Workload Type: copy_crc32c 00:06:14.196 CRC-32C seed: 0 00:06:14.196 Vector size: 4096 bytes 00:06:14.196 Transfer size: 4096 bytes 00:06:14.196 Vector count 1 00:06:14.196 Module: software 00:06:14.196 Queue depth: 32 00:06:14.196 Allocate depth: 32 00:06:14.196 # threads/core: 1 00:06:14.196 Run time: 1 seconds 00:06:14.196 Verify: Yes 00:06:14.196 00:06:14.196 Running for 1 seconds... 00:06:14.196 00:06:14.196 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:14.196 ------------------------------------------------------------------------------------ 00:06:14.196 0,0 434432/s 1697 MiB/s 0 0 00:06:14.196 ==================================================================================== 00:06:14.196 Total 434432/s 1697 MiB/s 0 0' 00:06:14.196 00:14:12 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:12 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:14.196 00:14:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:14.196 00:14:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.196 00:14:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.196 00:14:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.196 00:14:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.196 00:14:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.196 00:14:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.196 00:14:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.196 00:14:12 -- accel/accel.sh@42 -- # jq -r . 
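As in the earlier tables, the copy_crc32c numbers are consistent with the 4096-byte transfer size; a quick check of the per-core row:

    # 434432 transfers/s at 4096 B each, in MiB/s (matches the table's 1697)
    awk 'BEGIN { printf "%.0f MiB/s\n", 434432 * 4096 / 1048576 }'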
00:06:14.196 [2024-07-15 00:14:12.944780] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:14.196 [2024-07-15 00:14:12.944871] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid316856 ] 00:06:14.196 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.196 [2024-07-15 00:14:13.015446] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.196 [2024-07-15 00:14:13.081131] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=0x1 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=0 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=software 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=32 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 
00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=32 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=1 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val=Yes 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:14.196 00:14:13 -- accel/accel.sh@21 -- # val= 00:06:14.196 00:14:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # IFS=: 00:06:14.196 00:14:13 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@21 -- # val= 00:06:15.573 00:14:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # IFS=: 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@21 -- # val= 00:06:15.573 00:14:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # IFS=: 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@21 -- # val= 00:06:15.573 00:14:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # IFS=: 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@21 -- # val= 00:06:15.573 00:14:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # IFS=: 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@21 -- # val= 00:06:15.573 00:14:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # IFS=: 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@21 -- # val= 00:06:15.573 00:14:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # IFS=: 00:06:15.573 00:14:14 -- accel/accel.sh@20 -- # read -r var val 00:06:15.573 00:14:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:15.573 00:14:14 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:15.573 00:14:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:15.573 00:06:15.573 real 0m2.657s 00:06:15.573 user 0m2.398s 00:06:15.573 sys 0m0.267s 00:06:15.573 00:14:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.573 00:14:14 -- common/autotest_common.sh@10 -- # set +x 00:06:15.573 ************************************ 00:06:15.573 END TEST accel_copy_crc32c 00:06:15.573 ************************************ 00:06:15.573 
00:14:14 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:15.573 00:14:14 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:15.573 00:14:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:15.573 00:14:14 -- common/autotest_common.sh@10 -- # set +x 00:06:15.573 ************************************ 00:06:15.573 START TEST accel_copy_crc32c_C2 00:06:15.573 ************************************ 00:06:15.573 00:14:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:15.573 00:14:14 -- accel/accel.sh@16 -- # local accel_opc 00:06:15.573 00:14:14 -- accel/accel.sh@17 -- # local accel_module 00:06:15.573 00:14:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:15.573 00:14:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:15.573 00:14:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:15.573 00:14:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:15.573 00:14:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:15.573 00:14:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:15.573 00:14:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:15.573 00:14:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:15.573 00:14:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:15.573 00:14:14 -- accel/accel.sh@42 -- # jq -r . 00:06:15.573 [2024-07-15 00:14:14.315652] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:15.573 [2024-07-15 00:14:14.315717] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid317139 ] 00:06:15.573 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.573 [2024-07-15 00:14:14.376198] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.573 [2024-07-15 00:14:14.443392] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.952 00:14:15 -- accel/accel.sh@18 -- # out=' 00:06:16.952 SPDK Configuration: 00:06:16.952 Core mask: 0x1 00:06:16.952 00:06:16.952 Accel Perf Configuration: 00:06:16.952 Workload Type: copy_crc32c 00:06:16.952 CRC-32C seed: 0 00:06:16.952 Vector size: 4096 bytes 00:06:16.952 Transfer size: 8192 bytes 00:06:16.952 Vector count 2 00:06:16.952 Module: software 00:06:16.952 Queue depth: 32 00:06:16.952 Allocate depth: 32 00:06:16.952 # threads/core: 1 00:06:16.952 Run time: 1 seconds 00:06:16.952 Verify: Yes 00:06:16.952 00:06:16.952 Running for 1 seconds... 
00:06:16.952 00:06:16.952 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:16.952 ------------------------------------------------------------------------------------ 00:06:16.952 0,0 301440/s 2355 MiB/s 0 0 00:06:16.952 ==================================================================================== 00:06:16.952 Total 301440/s 1177 MiB/s 0 0' 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.952 00:14:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:16.952 00:14:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:16.952 00:14:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.952 00:14:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.952 00:14:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.952 00:14:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.952 00:14:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.952 00:14:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.952 00:14:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.952 00:14:15 -- accel/accel.sh@42 -- # jq -r . 00:06:16.952 [2024-07-15 00:14:15.632200] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:16.952 [2024-07-15 00:14:15.632287] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid317368 ] 00:06:16.952 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.952 [2024-07-15 00:14:15.703417] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.952 [2024-07-15 00:14:15.769542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.952 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.952 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.952 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.952 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.952 00:14:15 -- accel/accel.sh@21 -- # val=0x1 00:06:16.952 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.952 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.952 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.952 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.952 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.952 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.952 00:14:15 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val=0 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 
00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val=software 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@23 -- # accel_module=software 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val=32 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val=32 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val=1 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val=Yes 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:16.953 00:14:15 -- accel/accel.sh@21 -- # val= 00:06:16.953 00:14:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # IFS=: 00:06:16.953 00:14:15 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@21 -- # val= 00:06:17.889 00:14:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # IFS=: 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@21 -- # val= 00:06:17.889 00:14:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # IFS=: 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@21 -- # val= 00:06:17.889 00:14:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # IFS=: 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@21 -- # val= 00:06:17.889 00:14:16 -- 
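The copy_crc32c -C 2 table above shows the same 2x split as the first crc32c -C 2 run: the per-core row appears to count the full 8192-byte chained transfer (301440 x 8192 bytes is about 2355 MiB/s) while the Total row counts only the 4096-byte vector size (301440 x 4096 bytes is about 1177 MiB/s), so the two rows describe one and the same throughput.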
accel/accel.sh@22 -- # case "$var" in 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # IFS=: 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@21 -- # val= 00:06:17.889 00:14:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # IFS=: 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@21 -- # val= 00:06:17.889 00:14:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # IFS=: 00:06:17.889 00:14:16 -- accel/accel.sh@20 -- # read -r var val 00:06:17.889 00:14:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:17.889 00:14:16 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:17.889 00:14:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:17.889 00:06:17.890 real 0m2.640s 00:06:17.890 user 0m2.406s 00:06:17.890 sys 0m0.243s 00:06:17.890 00:14:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:17.890 00:14:16 -- common/autotest_common.sh@10 -- # set +x 00:06:17.890 ************************************ 00:06:17.890 END TEST accel_copy_crc32c_C2 00:06:17.890 ************************************ 00:06:18.149 00:14:16 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:18.149 00:14:16 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:18.149 00:14:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:18.149 00:14:16 -- common/autotest_common.sh@10 -- # set +x 00:06:18.149 ************************************ 00:06:18.149 START TEST accel_dualcast 00:06:18.149 ************************************ 00:06:18.149 00:14:16 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:18.149 00:14:16 -- accel/accel.sh@16 -- # local accel_opc 00:06:18.149 00:14:16 -- accel/accel.sh@17 -- # local accel_module 00:06:18.149 00:14:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:18.149 00:14:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:18.149 00:14:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.149 00:14:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.149 00:14:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.149 00:14:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.149 00:14:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.149 00:14:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.149 00:14:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.149 00:14:16 -- accel/accel.sh@42 -- # jq -r . 00:06:18.149 [2024-07-15 00:14:17.011343] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:18.149 [2024-07-15 00:14:17.011438] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid317569 ] 00:06:18.149 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.149 [2024-07-15 00:14:17.080574] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.149 [2024-07-15 00:14:17.149215] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.551 00:14:18 -- accel/accel.sh@18 -- # out=' 00:06:19.551 SPDK Configuration: 00:06:19.551 Core mask: 0x1 00:06:19.551 00:06:19.551 Accel Perf Configuration: 00:06:19.551 Workload Type: dualcast 00:06:19.551 Transfer size: 4096 bytes 00:06:19.551 Vector count 1 00:06:19.551 Module: software 00:06:19.551 Queue depth: 32 00:06:19.551 Allocate depth: 32 00:06:19.551 # threads/core: 1 00:06:19.551 Run time: 1 seconds 00:06:19.551 Verify: Yes 00:06:19.551 00:06:19.551 Running for 1 seconds... 00:06:19.551 00:06:19.551 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:19.551 ------------------------------------------------------------------------------------ 00:06:19.551 0,0 670624/s 2619 MiB/s 0 0 00:06:19.551 ==================================================================================== 00:06:19.551 Total 670624/s 2619 MiB/s 0 0' 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:19.551 00:14:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:19.551 00:14:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:19.551 00:14:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:19.551 00:14:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:19.551 00:14:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:19.551 00:14:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:19.551 00:14:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:19.551 00:14:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:19.551 00:14:18 -- accel/accel.sh@42 -- # jq -r . 00:06:19.551 [2024-07-15 00:14:18.336230] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
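A dualcast writes one source buffer to two destination buffers in a single operation, which is why its per-transfer cost in the table above sits between copy and fill. As a loose shell analogue of the data flow only (accel_perf does this in memory, not via files):

    # one read, two writes - the shape of a dualcast, sketched with tee
    head -c 4096 /dev/urandom > src.bin
    tee dst1.bin < src.bin > dst2.bin
    cmp dst1.bin dst2.bin && echo 'both destinations match'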
00:06:19.551 [2024-07-15 00:14:18.336327] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid317729 ] 00:06:19.551 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.551 [2024-07-15 00:14:18.404581] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.551 [2024-07-15 00:14:18.474328] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val=0x1 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val=dualcast 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.551 00:14:18 -- accel/accel.sh@21 -- # val=software 00:06:19.551 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.551 00:14:18 -- accel/accel.sh@23 -- # accel_module=software 00:06:19.551 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- accel/accel.sh@21 -- # val=32 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- accel/accel.sh@21 -- # val=32 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- accel/accel.sh@21 -- # val=1 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- accel/accel.sh@21 -- # val=Yes 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:19.552 00:14:18 -- accel/accel.sh@21 -- # val= 00:06:19.552 00:14:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # IFS=: 00:06:19.552 00:14:18 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@21 -- # val= 00:06:20.931 00:14:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # IFS=: 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@21 -- # val= 00:06:20.931 00:14:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # IFS=: 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@21 -- # val= 00:06:20.931 00:14:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # IFS=: 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@21 -- # val= 00:06:20.931 00:14:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # IFS=: 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@21 -- # val= 00:06:20.931 00:14:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # IFS=: 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@21 -- # val= 00:06:20.931 00:14:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # IFS=: 00:06:20.931 00:14:19 -- accel/accel.sh@20 -- # read -r var val 00:06:20.931 00:14:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:20.931 00:14:19 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:20.931 00:14:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:20.931 00:06:20.931 real 0m2.657s 00:06:20.931 user 0m2.405s 00:06:20.931 sys 0m0.261s 00:06:20.931 00:14:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:20.931 00:14:19 -- common/autotest_common.sh@10 -- # set +x 00:06:20.931 ************************************ 00:06:20.931 END TEST accel_dualcast 00:06:20.931 ************************************ 00:06:20.931 00:14:19 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:20.931 00:14:19 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:20.931 00:14:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:20.931 00:14:19 -- common/autotest_common.sh@10 -- # set +x 00:06:20.931 ************************************ 00:06:20.931 START TEST accel_compare 00:06:20.931 ************************************ 00:06:20.931 00:14:19 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:20.931 00:14:19 -- accel/accel.sh@16 -- # local accel_opc 00:06:20.931 00:14:19 -- 
accel/accel.sh@17 -- # local accel_module 00:06:20.931 00:14:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:20.931 00:14:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.931 00:14:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:20.931 00:14:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.931 00:14:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.931 00:14:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.931 00:14:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.931 00:14:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.931 00:14:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.931 00:14:19 -- accel/accel.sh@42 -- # jq -r . 00:06:20.931 [2024-07-15 00:14:19.717023] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:20.931 [2024-07-15 00:14:19.717116] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318005 ] 00:06:20.931 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.931 [2024-07-15 00:14:19.786338] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.931 [2024-07-15 00:14:19.853576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.311 00:14:21 -- accel/accel.sh@18 -- # out=' 00:06:22.311 SPDK Configuration: 00:06:22.311 Core mask: 0x1 00:06:22.311 00:06:22.311 Accel Perf Configuration: 00:06:22.311 Workload Type: compare 00:06:22.311 Transfer size: 4096 bytes 00:06:22.311 Vector count 1 00:06:22.311 Module: software 00:06:22.311 Queue depth: 32 00:06:22.311 Allocate depth: 32 00:06:22.311 # threads/core: 1 00:06:22.311 Run time: 1 seconds 00:06:22.311 Verify: Yes 00:06:22.311 00:06:22.311 Running for 1 seconds... 00:06:22.311 00:06:22.311 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:22.311 ------------------------------------------------------------------------------------ 00:06:22.311 0,0 800704/s 3127 MiB/s 0 0 00:06:22.311 ==================================================================================== 00:06:22.311 Total 800704/s 3127 MiB/s 0 0' 00:06:22.311 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.311 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.311 00:14:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:22.312 00:14:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:22.312 00:14:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.312 00:14:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.312 00:14:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.312 00:14:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.312 00:14:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.312 00:14:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.312 00:14:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.312 00:14:21 -- accel/accel.sh@42 -- # jq -r . 00:06:22.312 [2024-07-15 00:14:21.041932] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
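(Annotation, not part of the captured output: each accel test in this run reduces to a single accel_perf invocation, visible in the xtrace above. A minimal sketch of reproducing the compare pass by hand in this workspace, assuming a built SPDK tree; the harness pipes its accel JSON config through -c /dev/fd/62, which is empty here and can presumably be dropped for a plain software-module run.)
cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
./build/examples/accel_perf -t 1 -w compare -y    # -t run time in seconds, -w workload, -y verify results
# Bandwidth in the table is transfers/s times the 4096-byte transfer size:
# 800704/s * 4096 B = 3279683584 B/s ~= 3127 MiB/s, matching both rows above.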
00:06:22.312 [2024-07-15 00:14:21.042020] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318277 ] 00:06:22.312 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.312 [2024-07-15 00:14:21.112091] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.312 [2024-07-15 00:14:21.177469] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=0x1 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=compare 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=software 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@23 -- # accel_module=software 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=32 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=32 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=1 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val=Yes 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:22.312 00:14:21 -- accel/accel.sh@21 -- # val= 00:06:22.312 00:14:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # IFS=: 00:06:22.312 00:14:21 -- accel/accel.sh@20 -- # read -r var val 00:06:23.699 00:14:22 -- accel/accel.sh@21 -- # val= 00:06:23.699 00:14:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.699 00:14:22 -- accel/accel.sh@20 -- # IFS=: 00:06:23.699 00:14:22 -- accel/accel.sh@20 -- # read -r var val 00:06:23.699 00:14:22 -- accel/accel.sh@21 -- # val= 00:06:23.699 00:14:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.699 00:14:22 -- accel/accel.sh@20 -- # IFS=: 00:06:23.699 00:14:22 -- accel/accel.sh@20 -- # read -r var val 00:06:23.700 00:14:22 -- accel/accel.sh@21 -- # val= 00:06:23.700 00:14:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # IFS=: 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # read -r var val 00:06:23.700 00:14:22 -- accel/accel.sh@21 -- # val= 00:06:23.700 00:14:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # IFS=: 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # read -r var val 00:06:23.700 00:14:22 -- accel/accel.sh@21 -- # val= 00:06:23.700 00:14:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # IFS=: 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # read -r var val 00:06:23.700 00:14:22 -- accel/accel.sh@21 -- # val= 00:06:23.700 00:14:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # IFS=: 00:06:23.700 00:14:22 -- accel/accel.sh@20 -- # read -r var val 00:06:23.700 00:14:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:23.700 00:14:22 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:23.700 00:14:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:23.700 00:06:23.700 real 0m2.654s 00:06:23.700 user 0m2.401s 00:06:23.700 sys 0m0.259s 00:06:23.700 00:14:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.700 00:14:22 -- common/autotest_common.sh@10 -- # set +x 00:06:23.700 ************************************ 00:06:23.700 END TEST accel_compare 00:06:23.700 ************************************ 00:06:23.700 00:14:22 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:23.700 00:14:22 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:23.700 00:14:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.700 00:14:22 -- common/autotest_common.sh@10 -- # set +x 00:06:23.700 ************************************ 00:06:23.700 START TEST accel_xor 00:06:23.700 ************************************ 00:06:23.700 00:14:22 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:23.700 00:14:22 -- accel/accel.sh@16 -- # local accel_opc 00:06:23.700 00:14:22 -- accel/accel.sh@17 
-- # local accel_module 00:06:23.700 00:14:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:23.700 00:14:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:23.700 00:14:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.700 00:14:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.700 00:14:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.700 00:14:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.700 00:14:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.700 00:14:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.700 00:14:22 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.700 00:14:22 -- accel/accel.sh@42 -- # jq -r . 00:06:23.700 [2024-07-15 00:14:22.416939] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:23.700 [2024-07-15 00:14:22.417022] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318558 ] 00:06:23.700 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.700 [2024-07-15 00:14:22.485789] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.700 [2024-07-15 00:14:22.552608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.719 00:14:23 -- accel/accel.sh@18 -- # out=' 00:06:24.719 SPDK Configuration: 00:06:24.719 Core mask: 0x1 00:06:24.719 00:06:24.719 Accel Perf Configuration: 00:06:24.719 Workload Type: xor 00:06:24.719 Source buffers: 2 00:06:24.719 Transfer size: 4096 bytes 00:06:24.719 Vector count 1 00:06:24.719 Module: software 00:06:24.719 Queue depth: 32 00:06:24.719 Allocate depth: 32 00:06:24.719 # threads/core: 1 00:06:24.719 Run time: 1 seconds 00:06:24.719 Verify: Yes 00:06:24.719 00:06:24.719 Running for 1 seconds... 00:06:24.719 00:06:24.719 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:24.719 ------------------------------------------------------------------------------------ 00:06:24.719 0,0 703552/s 2748 MiB/s 0 0 00:06:24.719 ==================================================================================== 00:06:24.719 Total 703552/s 2748 MiB/s 0 0' 00:06:24.719 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.719 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.719 00:14:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:24.719 00:14:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:24.719 00:14:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.719 00:14:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.719 00:14:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.719 00:14:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.719 00:14:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.719 00:14:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.720 00:14:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.720 00:14:23 -- accel/accel.sh@42 -- # jq -r . 00:06:24.720 [2024-07-15 00:14:23.739207] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
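(Annotation: the xor pass above runs with the workload default of two source buffers, as the "Source buffers: 2" line in the configuration dump shows. A minimal manual equivalent, under the same assumptions as the compare sketch:)
./build/examples/accel_perf -t 1 -w xor -y    # XOR two 4096 B source buffers into one destination
# 703552/s * 4096 B ~= 2748 MiB/s, as reported above.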
00:06:24.720 [2024-07-15 00:14:23.739303] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318835 ] 00:06:24.978 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.978 [2024-07-15 00:14:23.810085] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.978 [2024-07-15 00:14:23.875714] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=0x1 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=xor 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=2 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=software 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=32 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=32 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- 
accel/accel.sh@21 -- # val=1 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val=Yes 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:24.979 00:14:23 -- accel/accel.sh@21 -- # val= 00:06:24.979 00:14:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # IFS=: 00:06:24.979 00:14:23 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@21 -- # val= 00:06:26.357 00:14:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # IFS=: 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@21 -- # val= 00:06:26.357 00:14:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # IFS=: 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@21 -- # val= 00:06:26.357 00:14:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # IFS=: 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@21 -- # val= 00:06:26.357 00:14:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # IFS=: 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@21 -- # val= 00:06:26.357 00:14:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # IFS=: 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@21 -- # val= 00:06:26.357 00:14:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # IFS=: 00:06:26.357 00:14:25 -- accel/accel.sh@20 -- # read -r var val 00:06:26.357 00:14:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:26.357 00:14:25 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:26.357 00:14:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:26.357 00:06:26.357 real 0m2.652s 00:06:26.357 user 0m2.399s 00:06:26.357 sys 0m0.261s 00:06:26.357 00:14:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.357 00:14:25 -- common/autotest_common.sh@10 -- # set +x 00:06:26.358 ************************************ 00:06:26.358 END TEST accel_xor 00:06:26.358 ************************************ 00:06:26.358 00:14:25 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:26.358 00:14:25 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:26.358 00:14:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.358 00:14:25 -- common/autotest_common.sh@10 -- # set +x 00:06:26.358 ************************************ 00:06:26.358 START TEST accel_xor 
00:06:26.358 ************************************ 00:06:26.358 00:14:25 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:26.358 00:14:25 -- accel/accel.sh@16 -- # local accel_opc 00:06:26.358 00:14:25 -- accel/accel.sh@17 -- # local accel_module 00:06:26.358 00:14:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:26.358 00:14:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:26.358 00:14:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.358 00:14:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.358 00:14:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.358 00:14:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.358 00:14:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.358 00:14:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.358 00:14:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.358 00:14:25 -- accel/accel.sh@42 -- # jq -r . 00:06:26.358 [2024-07-15 00:14:25.107588] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:26.358 [2024-07-15 00:14:25.107653] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319117 ] 00:06:26.358 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.358 [2024-07-15 00:14:25.166903] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.358 [2024-07-15 00:14:25.233706] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.738 00:14:26 -- accel/accel.sh@18 -- # out=' 00:06:27.738 SPDK Configuration: 00:06:27.738 Core mask: 0x1 00:06:27.738 00:06:27.738 Accel Perf Configuration: 00:06:27.738 Workload Type: xor 00:06:27.738 Source buffers: 3 00:06:27.738 Transfer size: 4096 bytes 00:06:27.738 Vector count 1 00:06:27.738 Module: software 00:06:27.738 Queue depth: 32 00:06:27.738 Allocate depth: 32 00:06:27.738 # threads/core: 1 00:06:27.738 Run time: 1 seconds 00:06:27.738 Verify: Yes 00:06:27.738 00:06:27.738 Running for 1 seconds... 00:06:27.738 00:06:27.738 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:27.738 ------------------------------------------------------------------------------------ 00:06:27.738 0,0 673344/s 2630 MiB/s 0 0 00:06:27.738 ==================================================================================== 00:06:27.738 Total 673344/s 2630 MiB/s 0 0' 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.738 00:14:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:27.738 00:14:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:27.738 00:14:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.738 00:14:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:27.738 00:14:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.738 00:14:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.738 00:14:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:27.738 00:14:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:27.738 00:14:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:27.738 00:14:26 -- accel/accel.sh@42 -- # jq -r . 00:06:27.738 [2024-07-15 00:14:26.420923] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
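(Annotation: this second accel_xor test adds -x 3, and the configuration dump duly reports "Source buffers: 3". Throughput drops from 703552/s (2748 MiB/s) in the two-buffer run to 673344/s (2630 MiB/s) here, consistent with each operation reading one extra 4 KiB source buffer. Sketch, same assumptions as above:)
./build/examples/accel_perf -t 1 -w xor -y -x 3    # -x sets the number of XOR source buffers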
00:06:27.738 [2024-07-15 00:14:26.421015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319338 ] 00:06:27.738 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.738 [2024-07-15 00:14:26.490026] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.738 [2024-07-15 00:14:26.556350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.738 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.738 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.738 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.738 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.738 00:14:26 -- accel/accel.sh@21 -- # val=0x1 00:06:27.738 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.738 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.738 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.738 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.738 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.738 00:14:26 -- accel/accel.sh@21 -- # val=xor 00:06:27.738 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.738 00:14:26 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:27.738 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val=3 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val=software 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@23 -- # accel_module=software 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val=32 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val=32 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- 
accel/accel.sh@21 -- # val=1 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val=Yes 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:27.739 00:14:26 -- accel/accel.sh@21 -- # val= 00:06:27.739 00:14:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # IFS=: 00:06:27.739 00:14:26 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@21 -- # val= 00:06:28.677 00:14:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # IFS=: 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@21 -- # val= 00:06:28.677 00:14:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # IFS=: 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@21 -- # val= 00:06:28.677 00:14:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # IFS=: 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@21 -- # val= 00:06:28.677 00:14:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # IFS=: 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@21 -- # val= 00:06:28.677 00:14:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # IFS=: 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@21 -- # val= 00:06:28.677 00:14:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # IFS=: 00:06:28.677 00:14:27 -- accel/accel.sh@20 -- # read -r var val 00:06:28.677 00:14:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:28.677 00:14:27 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:28.677 00:14:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:28.677 00:06:28.677 real 0m2.633s 00:06:28.677 user 0m2.395s 00:06:28.677 sys 0m0.247s 00:06:28.677 00:14:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:28.677 00:14:27 -- common/autotest_common.sh@10 -- # set +x 00:06:28.677 ************************************ 00:06:28.677 END TEST accel_xor 00:06:28.677 ************************************ 00:06:28.936 00:14:27 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:28.936 00:14:27 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:28.936 00:14:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:28.936 00:14:27 -- common/autotest_common.sh@10 -- # set +x 00:06:28.936 ************************************ 00:06:28.936 START TEST 
accel_dif_verify 00:06:28.936 ************************************ 00:06:28.936 00:14:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:28.936 00:14:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:28.936 00:14:27 -- accel/accel.sh@17 -- # local accel_module 00:06:28.936 00:14:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:28.936 00:14:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:28.936 00:14:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.936 00:14:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.936 00:14:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.936 00:14:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.936 00:14:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.936 00:14:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.936 00:14:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.936 00:14:27 -- accel/accel.sh@42 -- # jq -r . 00:06:28.936 [2024-07-15 00:14:27.800161] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:28.936 [2024-07-15 00:14:27.800260] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319545 ] 00:06:28.936 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.936 [2024-07-15 00:14:27.871114] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.936 [2024-07-15 00:14:27.940075] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.316 00:14:29 -- accel/accel.sh@18 -- # out=' 00:06:30.316 SPDK Configuration: 00:06:30.316 Core mask: 0x1 00:06:30.316 00:06:30.316 Accel Perf Configuration: 00:06:30.316 Workload Type: dif_verify 00:06:30.316 Vector size: 4096 bytes 00:06:30.316 Transfer size: 4096 bytes 00:06:30.316 Block size: 512 bytes 00:06:30.316 Metadata size: 8 bytes 00:06:30.316 Vector count 1 00:06:30.316 Module: software 00:06:30.316 Queue depth: 32 00:06:30.316 Allocate depth: 32 00:06:30.316 # threads/core: 1 00:06:30.316 Run time: 1 seconds 00:06:30.316 Verify: No 00:06:30.316 00:06:30.316 Running for 1 seconds... 00:06:30.316 00:06:30.316 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:30.316 ------------------------------------------------------------------------------------ 00:06:30.316 0,0 238848/s 947 MiB/s 0 0 00:06:30.316 ==================================================================================== 00:06:30.316 Total 238848/s 933 MiB/s 0 0' 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:30.316 00:14:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:30.316 00:14:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:30.316 00:14:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.316 00:14:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.316 00:14:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.316 00:14:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.316 00:14:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.316 00:14:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.316 00:14:29 -- accel/accel.sh@42 -- # jq -r . 
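(Annotation: in the dif_verify configuration each 4096 B transfer is split into eight 512 B protected blocks, each carrying an 8 B T10 DIF tuple (2 B guard CRC, 2 B application tag, 4 B reference tag), i.e. 64 B of metadata per operation; "Verify: No" only means the tool's own -y data check is off, the workload itself still checks the tuples. The two bandwidth rows above differ because the per-core figure is consistent with the extended block size and the Total figure with payload only:)
./build/examples/accel_perf -t 1 -w dif_verify
# 238848 ops/s * 4160 B (4096 B payload + 8 * 8 B DIF tuples) ~= 947 MiB/s  (per-core row)
# 238848 ops/s * 4096 B (payload only)                        ~= 933 MiB/s  (Total row)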
00:06:30.316 [2024-07-15 00:14:29.128962] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:30.316 [2024-07-15 00:14:29.129057] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319715 ] 00:06:30.316 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.316 [2024-07-15 00:14:29.198650] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.316 [2024-07-15 00:14:29.266132] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val=0x1 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val=dif_verify 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.316 00:14:29 -- accel/accel.sh@21 -- # val=software 00:06:30.316 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.316 00:14:29 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:30.316 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val=32 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val=32 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val=1 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val=No 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:30.317 00:14:29 -- accel/accel.sh@21 -- # val= 00:06:30.317 00:14:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # IFS=: 00:06:30.317 00:14:29 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@21 -- # val= 00:06:31.695 00:14:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # IFS=: 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@21 -- # val= 00:06:31.695 00:14:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # IFS=: 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@21 -- # val= 00:06:31.695 00:14:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # IFS=: 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@21 -- # val= 00:06:31.695 00:14:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # IFS=: 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@21 -- # val= 00:06:31.695 00:14:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # IFS=: 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@21 -- # val= 00:06:31.695 00:14:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # IFS=: 00:06:31.695 00:14:30 -- accel/accel.sh@20 -- # read -r var val 00:06:31.695 00:14:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:31.695 00:14:30 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:31.695 00:14:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:31.695 00:06:31.695 real 0m2.661s 00:06:31.695 user 0m2.415s 00:06:31.695 sys 0m0.255s 00:06:31.695 00:14:30 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.695 00:14:30 -- common/autotest_common.sh@10 -- # set +x 00:06:31.695 ************************************ 00:06:31.695 END TEST accel_dif_verify 00:06:31.695 ************************************ 00:06:31.695 00:14:30 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:31.695 00:14:30 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:31.695 00:14:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.695 00:14:30 -- common/autotest_common.sh@10 -- # set +x 00:06:31.695 ************************************ 00:06:31.695 START TEST accel_dif_generate 00:06:31.695 ************************************ 00:06:31.695 00:14:30 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:06:31.695 00:14:30 -- accel/accel.sh@16 -- # local accel_opc 00:06:31.695 00:14:30 -- accel/accel.sh@17 -- # local accel_module 00:06:31.695 00:14:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:31.695 00:14:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.695 00:14:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:31.695 00:14:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.695 00:14:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.695 00:14:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.695 00:14:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.695 00:14:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.695 00:14:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.695 00:14:30 -- accel/accel.sh@42 -- # jq -r . 00:06:31.695 [2024-07-15 00:14:30.504252] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:31.695 [2024-07-15 00:14:30.504341] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319979 ] 00:06:31.695 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.695 [2024-07-15 00:14:30.572736] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.695 [2024-07-15 00:14:30.640348] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.072 00:14:31 -- accel/accel.sh@18 -- # out=' 00:06:33.072 SPDK Configuration: 00:06:33.072 Core mask: 0x1 00:06:33.072 00:06:33.072 Accel Perf Configuration: 00:06:33.072 Workload Type: dif_generate 00:06:33.072 Vector size: 4096 bytes 00:06:33.072 Transfer size: 4096 bytes 00:06:33.072 Block size: 512 bytes 00:06:33.072 Metadata size: 8 bytes 00:06:33.072 Vector count 1 00:06:33.072 Module: software 00:06:33.072 Queue depth: 32 00:06:33.072 Allocate depth: 32 00:06:33.072 # threads/core: 1 00:06:33.072 Run time: 1 seconds 00:06:33.072 Verify: No 00:06:33.072 00:06:33.072 Running for 1 seconds... 
00:06:33.072 00:06:33.072 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:33.072 ------------------------------------------------------------------------------------ 00:06:33.072 0,0 288928/s 1146 MiB/s 0 0 00:06:33.072 ==================================================================================== 00:06:33.072 Total 288928/s 1128 MiB/s 0 0' 00:06:33.072 00:14:31 -- accel/accel.sh@20 -- # IFS=: 00:06:33.072 00:14:31 -- accel/accel.sh@20 -- # read -r var val 00:06:33.072 00:14:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:33.072 00:14:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:33.072 00:14:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.072 00:14:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.072 00:14:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.072 00:14:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.072 00:14:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.072 00:14:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.072 00:14:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.072 00:14:31 -- accel/accel.sh@42 -- # jq -r . 00:06:33.072 [2024-07-15 00:14:31.826990] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:33.072 [2024-07-15 00:14:31.827078] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320245 ] 00:06:33.072 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.072 [2024-07-15 00:14:31.896505] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.072 [2024-07-15 00:14:31.963417] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.072 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.072 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.072 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.072 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.072 00:14:32 -- accel/accel.sh@21 -- # val=0x1 00:06:33.072 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.072 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.072 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.072 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val=dif_generate 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 
00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val=software 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@23 -- # accel_module=software 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val=32 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val=32 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val=1 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val=No 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:33.073 00:14:32 -- accel/accel.sh@21 -- # val= 00:06:33.073 00:14:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # IFS=: 00:06:33.073 00:14:32 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@21 -- # val= 00:06:34.452 00:14:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # IFS=: 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@21 -- # val= 00:06:34.452 00:14:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # IFS=: 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@21 -- # val= 00:06:34.452 00:14:33 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # IFS=: 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@21 -- # val= 00:06:34.452 00:14:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # IFS=: 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@21 -- # val= 00:06:34.452 00:14:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # IFS=: 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@21 -- # val= 00:06:34.452 00:14:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # IFS=: 00:06:34.452 00:14:33 -- accel/accel.sh@20 -- # read -r var val 00:06:34.452 00:14:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:34.452 00:14:33 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:34.452 00:14:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:34.452 00:06:34.452 real 0m2.652s 00:06:34.452 user 0m2.415s 00:06:34.452 sys 0m0.247s 00:06:34.452 00:14:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.452 00:14:33 -- common/autotest_common.sh@10 -- # set +x 00:06:34.452 ************************************ 00:06:34.452 END TEST accel_dif_generate 00:06:34.452 ************************************ 00:06:34.452 00:14:33 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:34.452 00:14:33 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:34.452 00:14:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:34.452 00:14:33 -- common/autotest_common.sh@10 -- # set +x 00:06:34.453 ************************************ 00:06:34.453 START TEST accel_dif_generate_copy 00:06:34.453 ************************************ 00:06:34.453 00:14:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:06:34.453 00:14:33 -- accel/accel.sh@16 -- # local accel_opc 00:06:34.453 00:14:33 -- accel/accel.sh@17 -- # local accel_module 00:06:34.453 00:14:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:34.453 00:14:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:34.453 00:14:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.453 00:14:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.453 00:14:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.453 00:14:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.453 00:14:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.453 00:14:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.453 00:14:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.453 00:14:33 -- accel/accel.sh@42 -- # jq -r . 00:06:34.453 [2024-07-15 00:14:33.206513] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
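(Annotation: accel_dif_generate above computes and writes the same 8 B-per-block DIF tuples instead of checking them, reaching 288928 ops/s; the same per-core/Total split applies, 288928/s * 4160 B ~= 1146 MiB/s versus * 4096 B ~= 1128 MiB/s. The dif_generate_copy test starting here additionally copies the payload into the destination while inserting the tuples, per the SPDK DIF helper semantics the name suggests. Sketch, same assumptions as above:)
./build/examples/accel_perf -t 1 -w dif_generate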
00:06:34.453 [2024-07-15 00:14:33.206600] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320534 ] 00:06:34.453 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.453 [2024-07-15 00:14:33.276142] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.453 [2024-07-15 00:14:33.343540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.831 00:14:34 -- accel/accel.sh@18 -- # out=' 00:06:35.831 SPDK Configuration: 00:06:35.831 Core mask: 0x1 00:06:35.831 00:06:35.831 Accel Perf Configuration: 00:06:35.831 Workload Type: dif_generate_copy 00:06:35.831 Vector size: 4096 bytes 00:06:35.831 Transfer size: 4096 bytes 00:06:35.831 Vector count 1 00:06:35.831 Module: software 00:06:35.831 Queue depth: 32 00:06:35.831 Allocate depth: 32 00:06:35.831 # threads/core: 1 00:06:35.831 Run time: 1 seconds 00:06:35.831 Verify: No 00:06:35.831 00:06:35.831 Running for 1 seconds... 00:06:35.831 00:06:35.831 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:35.831 ------------------------------------------------------------------------------------ 00:06:35.831 0,0 223744/s 887 MiB/s 0 0 00:06:35.831 ==================================================================================== 00:06:35.831 Total 223744/s 874 MiB/s 0 0' 00:06:35.831 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.831 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.831 00:14:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:35.832 00:14:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:35.832 00:14:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.832 00:14:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.832 00:14:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.832 00:14:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.832 00:14:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.832 00:14:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.832 00:14:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.832 00:14:34 -- accel/accel.sh@42 -- # jq -r . 00:06:35.832 [2024-07-15 00:14:34.532118] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
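(Annotation: dif_generate_copy lands at 223744 ops/s, i.e. 223744/s * 4096 B ~= 874 MiB/s on the Total row, roughly 23% below plain dif_generate's 288928 ops/s; the gap is plausibly the cost of the extra 4 KiB payload copy each operation performs. Sketch:)
./build/examples/accel_perf -t 1 -w dif_generate_copy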
00:06:35.832 [2024-07-15 00:14:34.532208] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320802 ] 00:06:35.832 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.832 [2024-07-15 00:14:34.603493] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.832 [2024-07-15 00:14:34.668828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=0x1 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=software 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@23 -- # accel_module=software 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=32 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=32 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var 
val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=1 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val=No 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:35.832 00:14:34 -- accel/accel.sh@21 -- # val= 00:06:35.832 00:14:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # IFS=: 00:06:35.832 00:14:34 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@21 -- # val= 00:06:37.209 00:14:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # IFS=: 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@21 -- # val= 00:06:37.209 00:14:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # IFS=: 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@21 -- # val= 00:06:37.209 00:14:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # IFS=: 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@21 -- # val= 00:06:37.209 00:14:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # IFS=: 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@21 -- # val= 00:06:37.209 00:14:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # IFS=: 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@21 -- # val= 00:06:37.209 00:14:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # IFS=: 00:06:37.209 00:14:35 -- accel/accel.sh@20 -- # read -r var val 00:06:37.209 00:14:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:37.209 00:14:35 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:37.209 00:14:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.209 00:06:37.209 real 0m2.655s 00:06:37.209 user 0m2.407s 00:06:37.209 sys 0m0.256s 00:06:37.209 00:14:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.209 00:14:35 -- common/autotest_common.sh@10 -- # set +x 00:06:37.209 ************************************ 00:06:37.209 END TEST accel_dif_generate_copy 00:06:37.209 ************************************ 00:06:37.209 00:14:35 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:37.209 00:14:35 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:37.209 00:14:35 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:37.209 00:14:35 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.209 00:14:35 -- common/autotest_common.sh@10 -- # set +x 00:06:37.209 ************************************ 00:06:37.209 START TEST accel_comp 00:06:37.209 ************************************ 00:06:37.209 00:14:35 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:37.209 00:14:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.209 00:14:35 -- accel/accel.sh@17 -- # local accel_module 00:06:37.210 00:14:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:37.210 00:14:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:37.210 00:14:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.210 00:14:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.210 00:14:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.210 00:14:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.210 00:14:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.210 00:14:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.210 00:14:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.210 00:14:35 -- accel/accel.sh@42 -- # jq -r . 00:06:37.210 [2024-07-15 00:14:35.909739] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:37.210 [2024-07-15 00:14:35.909829] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321088 ] 00:06:37.210 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.210 [2024-07-15 00:14:35.979744] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.210 [2024-07-15 00:14:36.047580] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.586 00:14:37 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:38.586 00:06:38.586 SPDK Configuration: 00:06:38.586 Core mask: 0x1 00:06:38.586 00:06:38.586 Accel Perf Configuration: 00:06:38.586 Workload Type: compress 00:06:38.586 Transfer size: 4096 bytes 00:06:38.586 Vector count 1 00:06:38.586 Module: software 00:06:38.586 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:38.586 Queue depth: 32 00:06:38.586 Allocate depth: 32 00:06:38.586 # threads/core: 1 00:06:38.586 Run time: 1 seconds 00:06:38.586 Verify: No 00:06:38.586 00:06:38.586 Running for 1 seconds... 
00:06:38.586 00:06:38.586 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:38.586 ------------------------------------------------------------------------------------ 00:06:38.586 0,0 69248/s 288 MiB/s 0 0 00:06:38.586 ==================================================================================== 00:06:38.586 Total 69248/s 270 MiB/s 0 0' 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.586 00:14:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:38.586 00:14:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:38.586 00:14:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.586 00:14:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.586 00:14:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.586 00:14:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.586 00:14:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.586 00:14:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.586 00:14:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.586 00:14:37 -- accel/accel.sh@42 -- # jq -r . 00:06:38.586 [2024-07-15 00:14:37.238226] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:38.586 [2024-07-15 00:14:37.238309] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321320 ] 00:06:38.586 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.586 [2024-07-15 00:14:37.307144] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.586 [2024-07-15 00:14:37.372780] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.586 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.586 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.586 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.586 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.586 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.586 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.586 00:14:37 -- accel/accel.sh@21 -- # val=0x1 00:06:38.586 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.586 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=compress 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 
00:14:37 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=software 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@23 -- # accel_module=software 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=32 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=32 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=1 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val=No 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:38.587 00:14:37 -- accel/accel.sh@21 -- # val= 00:06:38.587 00:14:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # IFS=: 00:06:38.587 00:14:37 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@21 -- # val= 00:06:39.523 00:14:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # IFS=: 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@21 -- # val= 00:06:39.523 00:14:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # IFS=: 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@21 -- # val= 00:06:39.523 00:14:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # 
IFS=: 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@21 -- # val= 00:06:39.523 00:14:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # IFS=: 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@21 -- # val= 00:06:39.523 00:14:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # IFS=: 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@21 -- # val= 00:06:39.523 00:14:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # IFS=: 00:06:39.523 00:14:38 -- accel/accel.sh@20 -- # read -r var val 00:06:39.523 00:14:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:39.523 00:14:38 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:39.523 00:14:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:39.523 00:06:39.523 real 0m2.659s 00:06:39.523 user 0m2.409s 00:06:39.523 sys 0m0.258s 00:06:39.523 00:14:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:39.523 00:14:38 -- common/autotest_common.sh@10 -- # set +x 00:06:39.523 ************************************ 00:06:39.523 END TEST accel_comp 00:06:39.523 ************************************ 00:06:39.782 00:14:38 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:39.782 00:14:38 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:39.782 00:14:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:39.782 00:14:38 -- common/autotest_common.sh@10 -- # set +x 00:06:39.782 ************************************ 00:06:39.782 START TEST accel_decomp 00:06:39.782 ************************************ 00:06:39.782 00:14:38 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:39.782 00:14:38 -- accel/accel.sh@16 -- # local accel_opc 00:06:39.782 00:14:38 -- accel/accel.sh@17 -- # local accel_module 00:06:39.782 00:14:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:39.782 00:14:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:39.782 00:14:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.782 00:14:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.782 00:14:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.782 00:14:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.782 00:14:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.782 00:14:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.782 00:14:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.782 00:14:38 -- accel/accel.sh@42 -- # jq -r . 00:06:39.782 [2024-07-15 00:14:38.617721] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
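accel_comp above compressed the bib test file with Verify: No, and the accel_decomp run starting here replays it with Verify: Yes, so each decompressed block is checked against the original input. Both hand the input file to accel_perf with -l, and -y switches verification on, matching the command lines in this trace. A hedged sketch of the pair ($SPDK and the /dev/null config stream as in the earlier sketch):

    BIB="$SPDK/test/accel/bib"
    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w compress   -l "$BIB"    62</dev/null
    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress -l "$BIB" -y 62</dev/null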
00:06:39.782 [2024-07-15 00:14:38.617813] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321523 ] 00:06:39.782 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.782 [2024-07-15 00:14:38.687968] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.782 [2024-07-15 00:14:38.755253] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.159 00:14:39 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:41.159 00:06:41.159 SPDK Configuration: 00:06:41.159 Core mask: 0x1 00:06:41.159 00:06:41.159 Accel Perf Configuration: 00:06:41.159 Workload Type: decompress 00:06:41.159 Transfer size: 4096 bytes 00:06:41.159 Vector count 1 00:06:41.159 Module: software 00:06:41.159 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.159 Queue depth: 32 00:06:41.159 Allocate depth: 32 00:06:41.159 # threads/core: 1 00:06:41.159 Run time: 1 seconds 00:06:41.159 Verify: Yes 00:06:41.159 00:06:41.159 Running for 1 seconds... 00:06:41.159 00:06:41.159 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.159 ------------------------------------------------------------------------------------ 00:06:41.159 0,0 95552/s 176 MiB/s 0 0 00:06:41.159 ==================================================================================== 00:06:41.159 Total 95552/s 373 MiB/s 0 0' 00:06:41.159 00:14:39 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:41.159 00:14:39 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:41.159 00:14:39 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.159 00:14:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.159 00:14:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.159 00:14:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.159 00:14:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.159 00:14:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.159 00:14:39 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.159 00:14:39 -- accel/accel.sh@42 -- # jq -r . 00:06:41.159 [2024-07-15 00:14:39.932611] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:41.159 [2024-07-15 00:14:39.932663] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321681 ] 00:06:41.159 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.159 [2024-07-15 00:14:39.997627] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.159 [2024-07-15 00:14:40.086113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val=0x1 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val=decompress 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.159 00:14:40 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.159 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.159 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val=software 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val=32 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 
-- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val=32 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val=1 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val=Yes 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:41.160 00:14:40 -- accel/accel.sh@21 -- # val= 00:06:41.160 00:14:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # IFS=: 00:06:41.160 00:14:40 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@21 -- # val= 00:06:42.538 00:14:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # IFS=: 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@21 -- # val= 00:06:42.538 00:14:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # IFS=: 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@21 -- # val= 00:06:42.538 00:14:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # IFS=: 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@21 -- # val= 00:06:42.538 00:14:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # IFS=: 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@21 -- # val= 00:06:42.538 00:14:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # IFS=: 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@21 -- # val= 00:06:42.538 00:14:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # IFS=: 00:06:42.538 00:14:41 -- accel/accel.sh@20 -- # read -r var val 00:06:42.538 00:14:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:42.538 00:14:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:42.538 00:14:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.538 00:06:42.538 real 0m2.660s 00:06:42.538 user 0m2.413s 00:06:42.538 sys 0m0.247s 00:06:42.538 00:14:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.538 00:14:41 -- common/autotest_common.sh@10 -- # set +x 00:06:42.538 ************************************ 00:06:42.538 END TEST accel_decomp 00:06:42.538 ************************************ 00:06:42.538 00:14:41 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:42.538 00:14:41 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:42.538 00:14:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:42.538 00:14:41 -- common/autotest_common.sh@10 -- # set +x 00:06:42.538 ************************************ 00:06:42.538 START TEST accel_decmop_full 00:06:42.538 ************************************ 00:06:42.538 00:14:41 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:42.538 00:14:41 -- accel/accel.sh@16 -- # local accel_opc 00:06:42.538 00:14:41 -- accel/accel.sh@17 -- # local accel_module 00:06:42.538 00:14:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:42.538 00:14:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:42.538 00:14:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.538 00:14:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.538 00:14:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.538 00:14:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.538 00:14:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.538 00:14:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.538 00:14:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.538 00:14:41 -- accel/accel.sh@42 -- # jq -r . 00:06:42.538 [2024-07-15 00:14:41.310702] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:42.538 [2024-07-15 00:14:41.310777] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321954 ] 00:06:42.538 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.538 [2024-07-15 00:14:41.379551] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.538 [2024-07-15 00:14:41.448626] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.915 00:14:42 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:43.916 00:06:43.916 SPDK Configuration: 00:06:43.916 Core mask: 0x1 00:06:43.916 00:06:43.916 Accel Perf Configuration: 00:06:43.916 Workload Type: decompress 00:06:43.916 Transfer size: 111250 bytes 00:06:43.916 Vector count 1 00:06:43.916 Module: software 00:06:43.916 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:43.916 Queue depth: 32 00:06:43.916 Allocate depth: 32 00:06:43.916 # threads/core: 1 00:06:43.916 Run time: 1 seconds 00:06:43.916 Verify: Yes 00:06:43.916 00:06:43.916 Running for 1 seconds... 
00:06:43.916 00:06:43.916 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:43.916 ------------------------------------------------------------------------------------ 00:06:43.916 0,0 5824/s 240 MiB/s 0 0 00:06:43.916 ==================================================================================== 00:06:43.916 Total 5824/s 617 MiB/s 0 0' 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:43.916 00:14:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.916 00:14:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.916 00:14:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.916 00:14:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.916 00:14:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.916 00:14:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.916 00:14:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.916 00:14:42 -- accel/accel.sh@42 -- # jq -r . 00:06:43.916 [2024-07-15 00:14:42.630013] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:43.916 [2024-07-15 00:14:42.630065] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid322226 ] 00:06:43.916 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.916 [2024-07-15 00:14:42.693238] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.916 [2024-07-15 00:14:42.758714] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=0x1 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=decompress 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=software 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@23 -- # accel_module=software 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=32 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=32 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=1 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val=Yes 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:43.916 00:14:42 -- accel/accel.sh@21 -- # val= 00:06:43.916 00:14:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # IFS=: 00:06:43.916 00:14:42 -- accel/accel.sh@20 -- # read -r var val 00:06:45.292 00:14:43 -- accel/accel.sh@21 -- # val= 00:06:45.292 00:14:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # IFS=: 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # read -r var val 00:06:45.292 00:14:43 -- accel/accel.sh@21 -- # val= 00:06:45.292 00:14:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # IFS=: 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # read -r var val 00:06:45.292 00:14:43 -- accel/accel.sh@21 -- # val= 00:06:45.292 00:14:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.292 00:14:43 
-- accel/accel.sh@20 -- # IFS=: 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # read -r var val 00:06:45.292 00:14:43 -- accel/accel.sh@21 -- # val= 00:06:45.292 00:14:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # IFS=: 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # read -r var val 00:06:45.292 00:14:43 -- accel/accel.sh@21 -- # val= 00:06:45.292 00:14:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # IFS=: 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # read -r var val 00:06:45.292 00:14:43 -- accel/accel.sh@21 -- # val= 00:06:45.292 00:14:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # IFS=: 00:06:45.292 00:14:43 -- accel/accel.sh@20 -- # read -r var val 00:06:45.293 00:14:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.293 00:14:43 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:45.293 00:14:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.293 00:06:45.293 real 0m2.647s 00:06:45.293 user 0m2.405s 00:06:45.293 sys 0m0.239s 00:06:45.293 00:14:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.293 00:14:43 -- common/autotest_common.sh@10 -- # set +x 00:06:45.293 ************************************ 00:06:45.293 END TEST accel_decmop_full 00:06:45.293 ************************************ 00:06:45.293 00:14:43 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.293 00:14:43 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:45.293 00:14:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.293 00:14:43 -- common/autotest_common.sh@10 -- # set +x 00:06:45.293 ************************************ 00:06:45.293 START TEST accel_decomp_mcore 00:06:45.293 ************************************ 00:06:45.293 00:14:43 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.293 00:14:43 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.293 00:14:43 -- accel/accel.sh@17 -- # local accel_module 00:06:45.293 00:14:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.293 00:14:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.293 00:14:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.293 00:14:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.293 00:14:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.293 00:14:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.293 00:14:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.293 00:14:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.293 00:14:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.293 00:14:43 -- accel/accel.sh@42 -- # jq -r . 00:06:45.293 [2024-07-15 00:14:43.995153] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
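The accel_decmop_full run above adds -o 0, and the reported transfer size grows from 4096 to 111250 bytes per operation, so the op rate drops while bandwidth stays high: 5824 transfers/s of 111250-byte blocks is the 617 MiB/s on the Total line. Checking:

    awk 'BEGIN { printf "%.0f MiB/s\n", 5824 * 111250 / 1048576 }'   # -> ~618; the table reports 617

The accel_decomp_mcore case starting here keeps 4096-byte transfers but widens the core mask to 0xf, so four reactors run the workload in parallel.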
00:06:45.293 [2024-07-15 00:14:43.995240] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid322507 ] 00:06:45.293 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.293 [2024-07-15 00:14:44.065336] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:45.293 [2024-07-15 00:14:44.135339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:45.293 [2024-07-15 00:14:44.135434] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:45.293 [2024-07-15 00:14:44.135495] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:45.293 [2024-07-15 00:14:44.135498] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.672 00:14:45 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:46.672 00:06:46.672 SPDK Configuration: 00:06:46.672 Core mask: 0xf 00:06:46.672 00:06:46.672 Accel Perf Configuration: 00:06:46.672 Workload Type: decompress 00:06:46.672 Transfer size: 4096 bytes 00:06:46.672 Vector count 1 00:06:46.672 Module: software 00:06:46.672 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:46.672 Queue depth: 32 00:06:46.672 Allocate depth: 32 00:06:46.672 # threads/core: 1 00:06:46.672 Run time: 1 seconds 00:06:46.672 Verify: Yes 00:06:46.672 00:06:46.672 Running for 1 seconds... 00:06:46.672 00:06:46.672 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:46.672 ------------------------------------------------------------------------------------ 00:06:46.672 0,0 78432/s 144 MiB/s 0 0 00:06:46.672 3,0 79200/s 145 MiB/s 0 0 00:06:46.672 2,0 78816/s 145 MiB/s 0 0 00:06:46.672 1,0 78752/s 145 MiB/s 0 0 00:06:46.672 ==================================================================================== 00:06:46.672 Total 315200/s 1231 MiB/s 0 0' 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:46.672 00:14:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:46.672 00:14:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.672 00:14:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.672 00:14:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.672 00:14:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.672 00:14:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.672 00:14:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.672 00:14:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.672 00:14:45 -- accel/accel.sh@42 -- # jq -r . 00:06:46.672 [2024-07-15 00:14:45.331559] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
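With core mask 0xf the app starts reactors on cores 0 through 3 and the table above reports one row per core; the Total line is their plain sum, and the aggregate bandwidth follows from the 4096-byte transfer size:

    awk 'BEGIN { t = 78432 + 79200 + 78816 + 78752; printf "%d/s, %.0f MiB/s\n", t, t * 4096 / 1048576 }'
    # -> 315200/s, 1231 MiB/s, matching the Total line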
00:06:46.672 [2024-07-15 00:14:45.331650] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid322785 ] 00:06:46.672 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.672 [2024-07-15 00:14:45.401467] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:46.672 [2024-07-15 00:14:45.469726] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:46.672 [2024-07-15 00:14:45.469823] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:46.672 [2024-07-15 00:14:45.469912] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:46.672 [2024-07-15 00:14:45.469914] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=0xf 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=decompress 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=software 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@23 -- # accel_module=software 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=32 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=32 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=1 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val=Yes 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:46.672 00:14:45 -- accel/accel.sh@21 -- # val= 00:06:46.672 00:14:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # IFS=: 00:06:46.672 00:14:45 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 
00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@21 -- # val= 00:06:47.611 00:14:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # IFS=: 00:06:47.611 00:14:46 -- accel/accel.sh@20 -- # read -r var val 00:06:47.611 00:14:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:47.611 00:14:46 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:47.611 00:14:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.611 00:06:47.611 real 0m2.681s 00:06:47.611 user 0m9.064s 00:06:47.611 sys 0m0.278s 00:06:47.611 00:14:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.611 00:14:46 -- common/autotest_common.sh@10 -- # set +x 00:06:47.611 ************************************ 00:06:47.611 END TEST accel_decomp_mcore 00:06:47.611 ************************************ 00:06:47.871 00:14:46 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:47.871 00:14:46 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:47.871 00:14:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:47.871 00:14:46 -- common/autotest_common.sh@10 -- # set +x 00:06:47.871 ************************************ 00:06:47.871 START TEST accel_decomp_full_mcore 00:06:47.871 ************************************ 00:06:47.871 00:14:46 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:47.871 00:14:46 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.871 00:14:46 -- accel/accel.sh@17 -- # local accel_module 00:06:47.871 00:14:46 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:47.871 00:14:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:47.871 00:14:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.871 00:14:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.871 00:14:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.871 00:14:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.871 00:14:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.871 00:14:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.871 00:14:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.871 00:14:46 -- accel/accel.sh@42 -- # jq -r . 00:06:47.871 [2024-07-15 00:14:46.726642] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
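accel_decomp_full_mcore combines both knobs: full-size 111250-byte transfers (-o 0) and the 0xf core mask (-m 0xf). For reference, a reactor mask decodes to core IDs bit by bit; a small bash helper illustrating the decoding (hypothetical, not part of the harness):

    # Hypothetical helper: list the core IDs selected by a hex reactor mask.
    mask_to_cores() {
      local mask=$(( $1 )) i
      for (( i = 0; mask >> i; i++ )); do
        (( (mask >> i) & 1 )) && printf '%d ' "$i"
      done
      echo
    }
    mask_to_cores 0xf   # -> 0 1 2 3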
00:06:47.871 [2024-07-15 00:14:46.726737] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323069 ] 00:06:47.871 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.871 [2024-07-15 00:14:46.798297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:47.871 [2024-07-15 00:14:46.866063] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:47.871 [2024-07-15 00:14:46.866150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:47.871 [2024-07-15 00:14:46.866238] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:47.871 [2024-07-15 00:14:46.866240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.251 00:14:48 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:49.251 00:06:49.251 SPDK Configuration: 00:06:49.251 Core mask: 0xf 00:06:49.251 00:06:49.251 Accel Perf Configuration: 00:06:49.251 Workload Type: decompress 00:06:49.251 Transfer size: 111250 bytes 00:06:49.251 Vector count 1 00:06:49.251 Module: software 00:06:49.251 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:49.251 Queue depth: 32 00:06:49.251 Allocate depth: 32 00:06:49.251 # threads/core: 1 00:06:49.251 Run time: 1 seconds 00:06:49.251 Verify: Yes 00:06:49.251 00:06:49.251 Running for 1 seconds... 00:06:49.251 00:06:49.251 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.251 ------------------------------------------------------------------------------------ 00:06:49.251 0,0 5760/s 237 MiB/s 0 0 00:06:49.251 3,0 5792/s 239 MiB/s 0 0 00:06:49.251 2,0 5792/s 239 MiB/s 0 0 00:06:49.251 1,0 5792/s 239 MiB/s 0 0 00:06:49.251 ==================================================================================== 00:06:49.251 Total 23136/s 2454 MiB/s 0 0' 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:49.251 00:14:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:49.251 00:14:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.251 00:14:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.251 00:14:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.251 00:14:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.251 00:14:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.251 00:14:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.251 00:14:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.251 00:14:48 -- accel/accel.sh@42 -- # jq -r . 00:06:49.251 [2024-07-15 00:14:48.074077] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
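The full_mcore table above scales as expected: four reactors each near 5.8K transfers/s of 111250-byte blocks, summing to the Total line:

    awk 'BEGIN { t = 5760 + 5792 + 5792 + 5792; printf "%d/s, %.0f MiB/s\n", t, t * 111250 / 1048576 }'
    # -> 23136/s, ~2455 MiB/s; the table reports 2454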
00:06:49.251 [2024-07-15 00:14:48.074175] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323304 ] 00:06:49.251 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.251 [2024-07-15 00:14:48.143307] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:49.251 [2024-07-15 00:14:48.212099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:49.251 [2024-07-15 00:14:48.212196] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:49.251 [2024-07-15 00:14:48.212279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:49.251 [2024-07-15 00:14:48.212280] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=0xf 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=decompress 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=software 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=32 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=32 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=1 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val=Yes 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:49.251 00:14:48 -- accel/accel.sh@21 -- # val= 00:06:49.251 00:14:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # IFS=: 00:06:49.251 00:14:48 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 
00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@21 -- # val= 00:06:50.631 00:14:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # IFS=: 00:06:50.631 00:14:49 -- accel/accel.sh@20 -- # read -r var val 00:06:50.631 00:14:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:50.631 00:14:49 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:50.631 00:14:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.631 00:06:50.631 real 0m2.703s 00:06:50.631 user 0m9.128s 00:06:50.631 sys 0m0.291s 00:06:50.631 00:14:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.631 00:14:49 -- common/autotest_common.sh@10 -- # set +x 00:06:50.631 ************************************ 00:06:50.631 END TEST accel_decomp_full_mcore 00:06:50.631 ************************************ 00:06:50.632 00:14:49 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:50.632 00:14:49 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:50.632 00:14:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:50.632 00:14:49 -- common/autotest_common.sh@10 -- # set +x 00:06:50.632 ************************************ 00:06:50.632 START TEST accel_decomp_mthread 00:06:50.632 ************************************ 00:06:50.632 00:14:49 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:50.632 00:14:49 -- accel/accel.sh@16 -- # local accel_opc 00:06:50.632 00:14:49 -- accel/accel.sh@17 -- # local accel_module 00:06:50.632 00:14:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:50.632 00:14:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:50.632 00:14:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.632 00:14:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.632 00:14:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.632 00:14:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.632 00:14:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.632 00:14:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.632 00:14:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.632 00:14:49 -- accel/accel.sh@42 -- # jq -r . 00:06:50.632 [2024-07-15 00:14:49.477149] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:50.632 [2024-07-15 00:14:49.477239] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323524 ] 00:06:50.632 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.632 [2024-07-15 00:14:49.547067] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.632 [2024-07-15 00:14:49.615266] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.018 00:14:50 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:52.018 00:06:52.018 SPDK Configuration: 00:06:52.018 Core mask: 0x1 00:06:52.018 00:06:52.018 Accel Perf Configuration: 00:06:52.018 Workload Type: decompress 00:06:52.018 Transfer size: 4096 bytes 00:06:52.018 Vector count 1 00:06:52.018 Module: software 00:06:52.018 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:52.018 Queue depth: 32 00:06:52.018 Allocate depth: 32 00:06:52.018 # threads/core: 2 00:06:52.018 Run time: 1 seconds 00:06:52.018 Verify: Yes 00:06:52.018 00:06:52.018 Running for 1 seconds... 00:06:52.018 00:06:52.018 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.018 ------------------------------------------------------------------------------------ 00:06:52.018 0,1 45952/s 84 MiB/s 0 0 00:06:52.018 0,0 45856/s 84 MiB/s 0 0 00:06:52.018 ==================================================================================== 00:06:52.018 Total 91808/s 358 MiB/s 0 0' 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:52.018 00:14:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:52.018 00:14:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.018 00:14:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.018 00:14:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.018 00:14:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.018 00:14:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.018 00:14:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.018 00:14:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.018 00:14:50 -- accel/accel.sh@42 -- # jq -r . 00:06:52.018 [2024-07-15 00:14:50.810269] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:52.018 [2024-07-15 00:14:50.810361] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323687 ] 00:06:52.018 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.018 [2024-07-15 00:14:50.881143] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.018 [2024-07-15 00:14:50.947773] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val= 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val= 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val= 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val=0x1 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val= 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val= 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.018 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.018 00:14:50 -- accel/accel.sh@21 -- # val=decompress 00:06:52.018 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:50 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.019 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:50 -- accel/accel.sh@21 -- # val= 00:06:52.019 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:50 -- accel/accel.sh@21 -- # val=software 00:06:52.019 00:14:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:50 -- accel/accel.sh@23 -- # accel_module=software 00:06:52.019 00:14:50 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val=32 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 
-- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val=32 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val=2 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val=Yes 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val= 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:52.019 00:14:51 -- accel/accel.sh@21 -- # val= 00:06:52.019 00:14:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # IFS=: 00:06:52.019 00:14:51 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@21 -- # val= 00:06:53.477 00:14:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # IFS=: 00:06:53.477 00:14:52 -- accel/accel.sh@20 -- # read -r var val 00:06:53.477 00:14:52 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:53.477 00:14:52 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:53.477 00:14:52 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.477 00:06:53.477 real 0m2.672s 00:06:53.477 user 0m2.414s 00:06:53.477 sys 0m0.268s 00:06:53.477 00:14:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.477 00:14:52 -- common/autotest_common.sh@10 -- # set +x 
00:06:53.477 ************************************ 00:06:53.477 END TEST accel_decomp_mthread 00:06:53.477 ************************************ 00:06:53.477 00:14:52 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:53.477 00:14:52 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:53.477 00:14:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:53.477 00:14:52 -- common/autotest_common.sh@10 -- # set +x 00:06:53.477 ************************************ 00:06:53.477 START TEST accel_deomp_full_mthread 00:06:53.477 ************************************ 00:06:53.477 00:14:52 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:53.477 00:14:52 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.477 00:14:52 -- accel/accel.sh@17 -- # local accel_module 00:06:53.477 00:14:52 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:53.477 00:14:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:53.477 00:14:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.477 00:14:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.477 00:14:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.477 00:14:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.478 00:14:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.478 00:14:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.478 00:14:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.478 00:14:52 -- accel/accel.sh@42 -- # jq -r . 00:06:53.478 [2024-07-15 00:14:52.199247] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:53.478 [2024-07-15 00:14:52.199339] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323940 ] 00:06:53.478 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.478 [2024-07-15 00:14:52.270726] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.478 [2024-07-15 00:14:52.338281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.854 00:14:53 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:54.854 00:06:54.854 SPDK Configuration: 00:06:54.854 Core mask: 0x1 00:06:54.854 00:06:54.854 Accel Perf Configuration: 00:06:54.854 Workload Type: decompress 00:06:54.854 Transfer size: 111250 bytes 00:06:54.854 Vector count 1 00:06:54.854 Module: software 00:06:54.854 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:54.854 Queue depth: 32 00:06:54.854 Allocate depth: 32 00:06:54.854 # threads/core: 2 00:06:54.854 Run time: 1 seconds 00:06:54.854 Verify: Yes 00:06:54.854 00:06:54.854 Running for 1 seconds... 
00:06:54.854 00:06:54.854 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:54.854 ------------------------------------------------------------------------------------ 00:06:54.854 0,1 2976/s 122 MiB/s 0 0 00:06:54.854 0,0 2912/s 120 MiB/s 0 0 00:06:54.854 ==================================================================================== 00:06:54.854 Total 5888/s 624 MiB/s 0 0' 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:54.854 00:14:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:54.854 00:14:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.854 00:14:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.854 00:14:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.854 00:14:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.854 00:14:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.854 00:14:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.854 00:14:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.854 00:14:53 -- accel/accel.sh@42 -- # jq -r . 00:06:54.854 [2024-07-15 00:14:53.551580] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:54.854 [2024-07-15 00:14:53.551670] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid324214 ] 00:06:54.854 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.854 [2024-07-15 00:14:53.621669] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.854 [2024-07-15 00:14:53.687554] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=0x1 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=decompress 
00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=software 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@23 -- # accel_module=software 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=32 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=32 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=2 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val=Yes 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:54.854 00:14:53 -- accel/accel.sh@21 -- # val= 00:06:54.854 00:14:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # IFS=: 00:06:54.854 00:14:53 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@21 -- # val= 00:06:56.232 00:14:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # IFS=: 00:06:56.232 00:14:54 -- accel/accel.sh@20 -- # read -r var val 00:06:56.232 00:14:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.232 00:14:54 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:56.232 00:14:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.232 00:06:56.232 real 0m2.705s 00:06:56.232 user 0m2.451s 00:06:56.232 sys 0m0.261s 00:06:56.232 00:14:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.232 00:14:54 -- common/autotest_common.sh@10 -- # set +x 00:06:56.232 ************************************ 00:06:56.232 END TEST accel_deomp_full_mthread 00:06:56.232 ************************************ 00:06:56.232 00:14:54 -- accel/accel.sh@116 -- # [[ n == y ]] 00:06:56.232 00:14:54 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:56.232 00:14:54 -- accel/accel.sh@129 -- # build_accel_config 00:06:56.232 00:14:54 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:06:56.232 00:14:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.232 00:14:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.232 00:14:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.232 00:14:54 -- common/autotest_common.sh@10 -- # set +x 00:06:56.232 00:14:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.232 00:14:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.232 00:14:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.232 00:14:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.232 00:14:54 -- accel/accel.sh@42 -- # jq -r . 00:06:56.232 ************************************ 00:06:56.232 START TEST accel_dif_functional_tests 00:06:56.232 ************************************ 00:06:56.232 00:14:54 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:56.232 [2024-07-15 00:14:54.956617] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:56.232 [2024-07-15 00:14:54.956704] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid324496 ] 00:06:56.232 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.232 [2024-07-15 00:14:55.026425] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:56.232 [2024-07-15 00:14:55.095636] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.232 [2024-07-15 00:14:55.095731] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.232 [2024-07-15 00:14:55.095732] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:56.232 00:06:56.232 00:06:56.232 CUnit - A unit testing framework for C - Version 2.1-3 00:06:56.232 http://cunit.sourceforge.net/ 00:06:56.232 00:06:56.232 00:06:56.232 Suite: accel_dif 00:06:56.232 Test: verify: DIF generated, GUARD check ...passed 00:06:56.232 Test: verify: DIF generated, APPTAG check ...passed 00:06:56.232 Test: verify: DIF generated, REFTAG check ...passed 00:06:56.232 Test: verify: DIF not generated, GUARD check ...[2024-07-15 00:14:55.163124] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:56.232 [2024-07-15 00:14:55.163172] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:56.232 passed 00:06:56.232 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 00:14:55.163222] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:56.232 [2024-07-15 00:14:55.163241] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:56.232 passed 00:06:56.233 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 00:14:55.163262] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:56.233 [2024-07-15 00:14:55.163280] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:56.233 passed 00:06:56.233 Test: verify: APPTAG correct, APPTAG check ...passed 00:06:56.233 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 00:14:55.163325] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:06:56.233 passed 00:06:56.233 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:06:56.233 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:06:56.233 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:06:56.233 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 00:14:55.163424] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:06:56.233 passed 00:06:56.233 Test: generate copy: DIF generated, GUARD check ...passed 00:06:56.233 Test: generate copy: DIF generated, APTTAG check ...passed 00:06:56.233 Test: generate copy: DIF generated, REFTAG check ...passed 00:06:56.233 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:06:56.233 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:06:56.233 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:06:56.233 Test: generate copy: iovecs-len validate ...[2024-07-15 00:14:55.163607] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:06:56.233 passed 00:06:56.233 Test: generate copy: buffer alignment validate ...passed 00:06:56.233 00:06:56.233 Run Summary: Type Total Ran Passed Failed Inactive 00:06:56.233 suites 1 1 n/a 0 0 00:06:56.233 tests 20 20 20 0 0 00:06:56.233 asserts 204 204 204 0 n/a 00:06:56.233 00:06:56.233 Elapsed time = 0.002 seconds 00:06:56.492 00:06:56.492 real 0m0.390s 00:06:56.492 user 0m0.580s 00:06:56.492 sys 0m0.163s 00:06:56.492 00:14:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.492 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.492 ************************************ 00:06:56.492 END TEST accel_dif_functional_tests 00:06:56.492 ************************************ 00:06:56.492 00:06:56.492 real 0m56.889s 00:06:56.492 user 1m4.502s 00:06:56.492 sys 0m7.073s 00:06:56.492 00:14:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.492 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.492 ************************************ 00:06:56.492 END TEST accel 00:06:56.492 ************************************ 00:06:56.492 00:14:55 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:56.492 00:14:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:56.492 00:14:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.492 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.492 ************************************ 00:06:56.492 START TEST accel_rpc 00:06:56.492 ************************************ 00:06:56.492 00:14:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:56.492 * Looking for test storage... 00:06:56.492 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:56.492 00:14:55 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:56.492 00:14:55 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:56.492 00:14:55 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=324659 00:06:56.492 00:14:55 -- accel/accel_rpc.sh@15 -- # waitforlisten 324659 00:06:56.492 00:14:55 -- common/autotest_common.sh@819 -- # '[' -z 324659 ']' 00:06:56.492 00:14:55 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.492 00:14:55 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:56.492 00:14:55 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.492 00:14:55 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:56.492 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.492 [2024-07-15 00:14:55.517664] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:56.492 [2024-07-15 00:14:55.517719] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid324659 ] 00:06:56.750 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.750 [2024-07-15 00:14:55.583037] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.750 [2024-07-15 00:14:55.654623] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:56.750 [2024-07-15 00:14:55.654732] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.750 00:14:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:56.750 00:14:55 -- common/autotest_common.sh@852 -- # return 0 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:56.750 00:14:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:56.750 00:14:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.750 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.750 ************************************ 00:06:56.750 START TEST accel_assign_opcode 00:06:56.750 ************************************ 00:06:56.750 00:14:55 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:56.750 00:14:55 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:56.750 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.750 [2024-07-15 00:14:55.715200] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:56.750 00:14:55 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:56.750 00:14:55 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:56.750 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.750 [2024-07-15 00:14:55.723211] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:06:56.750 00:14:55 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:56.750 00:14:55 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:56.750 00:14:55 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:56.750 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:57.009 00:14:55 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.009 00:14:55 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:57.009 00:14:55 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.009 00:14:55 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:06:57.009 00:14:55 -- common/autotest_common.sh@10 -- # set +x 00:06:57.009 00:14:55 -- accel/accel_rpc.sh@42 -- # grep software 00:06:57.009 00:14:55 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.009 software 00:06:57.009 00:06:57.009 real 0m0.240s 00:06:57.009 user 0m0.054s 00:06:57.009 sys 0m0.006s 00:06:57.009 00:14:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.009 00:14:55 -- common/autotest_common.sh@10 -- # set +x 
00:06:57.009 ************************************ 00:06:57.009 END TEST accel_assign_opcode 00:06:57.009 ************************************ 00:06:57.009 00:14:55 -- accel/accel_rpc.sh@55 -- # killprocess 324659 00:06:57.009 00:14:55 -- common/autotest_common.sh@926 -- # '[' -z 324659 ']' 00:06:57.009 00:14:55 -- common/autotest_common.sh@930 -- # kill -0 324659 00:06:57.009 00:14:55 -- common/autotest_common.sh@931 -- # uname 00:06:57.009 00:14:55 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:57.009 00:14:55 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 324659 00:06:57.009 00:14:56 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:57.009 00:14:56 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:57.009 00:14:56 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 324659' 00:06:57.009 killing process with pid 324659 00:06:57.009 00:14:56 -- common/autotest_common.sh@945 -- # kill 324659 00:06:57.009 00:14:56 -- common/autotest_common.sh@950 -- # wait 324659 00:06:57.578 00:06:57.578 real 0m0.934s 00:06:57.578 user 0m0.876s 00:06:57.578 sys 0m0.403s 00:06:57.578 00:14:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.578 00:14:56 -- common/autotest_common.sh@10 -- # set +x 00:06:57.578 ************************************ 00:06:57.578 END TEST accel_rpc 00:06:57.578 ************************************ 00:06:57.578 00:14:56 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:57.578 00:14:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:57.578 00:14:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:57.578 00:14:56 -- common/autotest_common.sh@10 -- # set +x 00:06:57.578 ************************************ 00:06:57.578 START TEST app_cmdline 00:06:57.578 ************************************ 00:06:57.578 00:14:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:57.578 * Looking for test storage... 00:06:57.578 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:57.578 00:14:56 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:57.578 00:14:56 -- app/cmdline.sh@17 -- # spdk_tgt_pid=324899 00:06:57.578 00:14:56 -- app/cmdline.sh@18 -- # waitforlisten 324899 00:06:57.578 00:14:56 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:57.578 00:14:56 -- common/autotest_common.sh@819 -- # '[' -z 324899 ']' 00:06:57.578 00:14:56 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:57.578 00:14:56 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:57.578 00:14:56 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:57.578 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:57.578 00:14:56 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:57.578 00:14:56 -- common/autotest_common.sh@10 -- # set +x 00:06:57.578 [2024-07-15 00:14:56.523898] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:57.578 [2024-07-15 00:14:56.523971] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid324899 ] 00:06:57.578 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.578 [2024-07-15 00:14:56.591890] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.838 [2024-07-15 00:14:56.667749] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:57.838 [2024-07-15 00:14:56.667853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.405 00:14:57 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:58.405 00:14:57 -- common/autotest_common.sh@852 -- # return 0 00:06:58.405 00:14:57 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:58.664 { 00:06:58.664 "version": "SPDK v24.01.1-pre git sha1 4b94202c6", 00:06:58.664 "fields": { 00:06:58.664 "major": 24, 00:06:58.664 "minor": 1, 00:06:58.664 "patch": 1, 00:06:58.664 "suffix": "-pre", 00:06:58.664 "commit": "4b94202c6" 00:06:58.664 } 00:06:58.664 } 00:06:58.664 00:14:57 -- app/cmdline.sh@22 -- # expected_methods=() 00:06:58.664 00:14:57 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:58.664 00:14:57 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:58.664 00:14:57 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:58.664 00:14:57 -- app/cmdline.sh@26 -- # sort 00:06:58.664 00:14:57 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:58.664 00:14:57 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:58.664 00:14:57 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:58.664 00:14:57 -- common/autotest_common.sh@10 -- # set +x 00:06:58.665 00:14:57 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:58.665 00:14:57 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:58.665 00:14:57 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:58.665 00:14:57 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:58.665 00:14:57 -- common/autotest_common.sh@640 -- # local es=0 00:06:58.665 00:14:57 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:58.665 00:14:57 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:58.665 00:14:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:58.665 00:14:57 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:58.665 00:14:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:58.665 00:14:57 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:58.665 00:14:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:58.665 00:14:57 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:58.665 00:14:57 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:06:58.665 00:14:57 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:58.665 request: 00:06:58.665 { 00:06:58.665 "method": "env_dpdk_get_mem_stats", 00:06:58.665 "req_id": 1 00:06:58.665 } 00:06:58.665 Got JSON-RPC error response 00:06:58.665 response: 00:06:58.665 { 00:06:58.665 "code": -32601, 00:06:58.665 "message": "Method not found" 00:06:58.665 } 00:06:58.665 00:14:57 -- common/autotest_common.sh@643 -- # es=1 00:06:58.665 00:14:57 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:58.665 00:14:57 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:58.665 00:14:57 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:58.665 00:14:57 -- app/cmdline.sh@1 -- # killprocess 324899 00:06:58.665 00:14:57 -- common/autotest_common.sh@926 -- # '[' -z 324899 ']' 00:06:58.665 00:14:57 -- common/autotest_common.sh@930 -- # kill -0 324899 00:06:58.665 00:14:57 -- common/autotest_common.sh@931 -- # uname 00:06:58.924 00:14:57 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:58.924 00:14:57 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 324899 00:06:58.924 00:14:57 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:58.924 00:14:57 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:58.924 00:14:57 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 324899' 00:06:58.924 killing process with pid 324899 00:06:58.924 00:14:57 -- common/autotest_common.sh@945 -- # kill 324899 00:06:58.924 00:14:57 -- common/autotest_common.sh@950 -- # wait 324899 00:06:59.184 00:06:59.184 real 0m1.665s 00:06:59.184 user 0m1.919s 00:06:59.184 sys 0m0.484s 00:06:59.184 00:14:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.184 00:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:59.184 ************************************ 00:06:59.184 END TEST app_cmdline 00:06:59.184 ************************************ 00:06:59.184 00:14:58 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:59.184 00:14:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:59.184 00:14:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:59.184 00:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:59.184 ************************************ 00:06:59.184 START TEST version 00:06:59.184 ************************************ 00:06:59.184 00:14:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:59.184 * Looking for test storage... 
00:06:59.184 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:59.184 00:14:58 -- app/version.sh@17 -- # get_header_version major 00:06:59.184 00:14:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:59.184 00:14:58 -- app/version.sh@14 -- # cut -f2 00:06:59.184 00:14:58 -- app/version.sh@14 -- # tr -d '"' 00:06:59.184 00:14:58 -- app/version.sh@17 -- # major=24 00:06:59.184 00:14:58 -- app/version.sh@18 -- # get_header_version minor 00:06:59.184 00:14:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:59.184 00:14:58 -- app/version.sh@14 -- # cut -f2 00:06:59.184 00:14:58 -- app/version.sh@14 -- # tr -d '"' 00:06:59.184 00:14:58 -- app/version.sh@18 -- # minor=1 00:06:59.184 00:14:58 -- app/version.sh@19 -- # get_header_version patch 00:06:59.184 00:14:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:59.184 00:14:58 -- app/version.sh@14 -- # cut -f2 00:06:59.184 00:14:58 -- app/version.sh@14 -- # tr -d '"' 00:06:59.184 00:14:58 -- app/version.sh@19 -- # patch=1 00:06:59.184 00:14:58 -- app/version.sh@20 -- # get_header_version suffix 00:06:59.184 00:14:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:59.184 00:14:58 -- app/version.sh@14 -- # tr -d '"' 00:06:59.184 00:14:58 -- app/version.sh@14 -- # cut -f2 00:06:59.444 00:14:58 -- app/version.sh@20 -- # suffix=-pre 00:06:59.444 00:14:58 -- app/version.sh@22 -- # version=24.1 00:06:59.444 00:14:58 -- app/version.sh@25 -- # (( patch != 0 )) 00:06:59.444 00:14:58 -- app/version.sh@25 -- # version=24.1.1 00:06:59.444 00:14:58 -- app/version.sh@28 -- # version=24.1.1rc0 00:06:59.444 00:14:58 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:59.444 00:14:58 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:59.444 00:14:58 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:06:59.444 00:14:58 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:06:59.444 00:06:59.444 real 0m0.174s 00:06:59.444 user 0m0.089s 00:06:59.444 sys 0m0.126s 00:06:59.444 00:14:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.444 00:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:59.444 ************************************ 00:06:59.444 END TEST version 00:06:59.444 ************************************ 00:06:59.444 00:14:58 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@204 -- # uname -s 00:06:59.444 00:14:58 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:06:59.444 00:14:58 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:06:59.444 00:14:58 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:06:59.444 00:14:58 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@268 -- # timing_exit lib 
00:06:59.444 00:14:58 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:59.444 00:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:59.444 00:14:58 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:06:59.444 00:14:58 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:06:59.444 00:14:58 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:06:59.444 00:14:58 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:06:59.444 00:14:58 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:59.444 00:14:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:59.444 00:14:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:59.444 00:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:59.444 ************************************ 00:06:59.444 START TEST llvm_fuzz 00:06:59.444 ************************************ 00:06:59.444 00:14:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:59.444 * Looking for test storage... 
00:06:59.444 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:06:59.444 00:14:58 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:06:59.444 00:14:58 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:06:59.444 00:14:58 -- common/autotest_common.sh@538 -- # fuzzers=() 00:06:59.444 00:14:58 -- common/autotest_common.sh@538 -- # local fuzzers 00:06:59.444 00:14:58 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:06:59.444 00:14:58 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:06:59.444 00:14:58 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:06:59.444 00:14:58 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:06:59.444 00:14:58 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:06:59.444 00:14:58 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:06:59.706 00:14:58 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:06:59.706 00:14:58 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:59.706 00:14:58 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:59.706 00:14:58 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:59.706 00:14:58 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:59.706 00:14:58 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:59.706 00:14:58 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:59.706 00:14:58 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:59.706 00:14:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:59.706 00:14:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:59.706 00:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:59.706 ************************************ 00:06:59.706 START TEST nvmf_fuzz 00:06:59.706 ************************************ 00:06:59.706 00:14:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:59.706 * Looking for test storage... 
00:06:59.706 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:59.706 00:14:58 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:06:59.706 00:14:58 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:06:59.706 00:14:58 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:06:59.706 00:14:58 -- common/autotest_common.sh@34 -- # set -e 00:06:59.706 00:14:58 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:06:59.706 00:14:58 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:06:59.706 00:14:58 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:06:59.706 00:14:58 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:06:59.706 00:14:58 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:06:59.706 00:14:58 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:06:59.706 00:14:58 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:06:59.706 00:14:58 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:06:59.706 00:14:58 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:06:59.706 00:14:58 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:06:59.706 00:14:58 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:06:59.706 00:14:58 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:06:59.706 00:14:58 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:06:59.706 00:14:58 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:06:59.706 00:14:58 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:06:59.706 00:14:58 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:06:59.706 00:14:58 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:06:59.706 00:14:58 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:06:59.706 00:14:58 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:06:59.706 00:14:58 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:06:59.706 00:14:58 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:59.706 00:14:58 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:06:59.706 00:14:58 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:06:59.706 00:14:58 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:06:59.706 00:14:58 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:06:59.706 00:14:58 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:06:59.706 00:14:58 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:06:59.706 00:14:58 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:06:59.706 00:14:58 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:06:59.706 00:14:58 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:06:59.706 00:14:58 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:06:59.706 00:14:58 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:06:59.706 00:14:58 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:06:59.706 00:14:58 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:06:59.706 00:14:58 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:06:59.706 00:14:58 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:06:59.706 00:14:58 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:59.706 00:14:58 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:06:59.706 00:14:58 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:06:59.706 00:14:58 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:06:59.706 00:14:58 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:06:59.706 00:14:58 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:06:59.706 00:14:58 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:06:59.706 00:14:58 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:06:59.706 00:14:58 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:06:59.706 00:14:58 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:06:59.706 00:14:58 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:06:59.706 00:14:58 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:06:59.706 00:14:58 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:06:59.706 00:14:58 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:06:59.706 00:14:58 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:06:59.706 00:14:58 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:06:59.706 00:14:58 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:06:59.706 00:14:58 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:06:59.706 00:14:58 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:06:59.706 00:14:58 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:06:59.706 00:14:58 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:06:59.706 00:14:58 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:06:59.706 00:14:58 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:06:59.706 00:14:58 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:06:59.706 00:14:58 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:06:59.706 00:14:58 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:06:59.706 00:14:58 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:06:59.706 00:14:58 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:06:59.706 00:14:58 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:06:59.706 00:14:58 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:06:59.706 00:14:58 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:06:59.706 00:14:58 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:06:59.706 00:14:58 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:06:59.706 00:14:58 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:06:59.706 00:14:58 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:06:59.706 00:14:58 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:06:59.706 00:14:58 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:06:59.706 00:14:58 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:06:59.706 00:14:58 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:06:59.706 00:14:58 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:06:59.706 00:14:58 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:06:59.706 00:14:58 -- common/autotest_common.sh@48 -- # source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:59.706 00:14:58 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:59.706 00:14:58 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:59.706 00:14:58 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:59.706 00:14:58 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:59.706 00:14:58 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:59.706 00:14:58 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:59.706 00:14:58 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:59.706 00:14:58 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:06:59.706 00:14:58 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:06:59.706 00:14:58 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:06:59.706 00:14:58 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:06:59.706 00:14:58 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:06:59.706 00:14:58 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:06:59.706 00:14:58 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:06:59.706 00:14:58 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:06:59.706 #define SPDK_CONFIG_H 00:06:59.706 #define SPDK_CONFIG_APPS 1 00:06:59.707 #define SPDK_CONFIG_ARCH native 00:06:59.707 #undef SPDK_CONFIG_ASAN 00:06:59.707 #undef SPDK_CONFIG_AVAHI 00:06:59.707 #undef SPDK_CONFIG_CET 00:06:59.707 #define SPDK_CONFIG_COVERAGE 1 00:06:59.707 #define SPDK_CONFIG_CROSS_PREFIX 00:06:59.707 #undef SPDK_CONFIG_CRYPTO 00:06:59.707 #undef SPDK_CONFIG_CRYPTO_MLX5 00:06:59.707 #undef SPDK_CONFIG_CUSTOMOCF 00:06:59.707 #undef SPDK_CONFIG_DAOS 00:06:59.707 #define SPDK_CONFIG_DAOS_DIR 00:06:59.707 #define SPDK_CONFIG_DEBUG 1 00:06:59.707 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:06:59.707 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:59.707 #define SPDK_CONFIG_DPDK_INC_DIR 00:06:59.707 #define SPDK_CONFIG_DPDK_LIB_DIR 00:06:59.707 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:06:59.707 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:59.707 #define SPDK_CONFIG_EXAMPLES 1 00:06:59.707 #undef SPDK_CONFIG_FC 00:06:59.707 #define SPDK_CONFIG_FC_PATH 00:06:59.707 #define SPDK_CONFIG_FIO_PLUGIN 1 00:06:59.707 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:06:59.707 #undef SPDK_CONFIG_FUSE 00:06:59.707 #define SPDK_CONFIG_FUZZER 1 00:06:59.707 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:06:59.707 #undef SPDK_CONFIG_GOLANG 00:06:59.707 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:06:59.707 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:06:59.707 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:06:59.707 #undef SPDK_CONFIG_HAVE_LIBBSD 00:06:59.707 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:06:59.707 #define SPDK_CONFIG_IDXD 1 00:06:59.707 #define SPDK_CONFIG_IDXD_KERNEL 1 00:06:59.707 #undef SPDK_CONFIG_IPSEC_MB 
00:06:59.707 #define SPDK_CONFIG_IPSEC_MB_DIR 00:06:59.707 #define SPDK_CONFIG_ISAL 1 00:06:59.707 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:06:59.707 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:06:59.707 #define SPDK_CONFIG_LIBDIR 00:06:59.707 #undef SPDK_CONFIG_LTO 00:06:59.707 #define SPDK_CONFIG_MAX_LCORES 00:06:59.707 #define SPDK_CONFIG_NVME_CUSE 1 00:06:59.707 #undef SPDK_CONFIG_OCF 00:06:59.707 #define SPDK_CONFIG_OCF_PATH 00:06:59.707 #define SPDK_CONFIG_OPENSSL_PATH 00:06:59.707 #undef SPDK_CONFIG_PGO_CAPTURE 00:06:59.707 #undef SPDK_CONFIG_PGO_USE 00:06:59.707 #define SPDK_CONFIG_PREFIX /usr/local 00:06:59.707 #undef SPDK_CONFIG_RAID5F 00:06:59.707 #undef SPDK_CONFIG_RBD 00:06:59.707 #define SPDK_CONFIG_RDMA 1 00:06:59.707 #define SPDK_CONFIG_RDMA_PROV verbs 00:06:59.707 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:06:59.707 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:06:59.707 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:06:59.707 #undef SPDK_CONFIG_SHARED 00:06:59.707 #undef SPDK_CONFIG_SMA 00:06:59.707 #define SPDK_CONFIG_TESTS 1 00:06:59.707 #undef SPDK_CONFIG_TSAN 00:06:59.707 #define SPDK_CONFIG_UBLK 1 00:06:59.707 #define SPDK_CONFIG_UBSAN 1 00:06:59.707 #undef SPDK_CONFIG_UNIT_TESTS 00:06:59.707 #undef SPDK_CONFIG_URING 00:06:59.707 #define SPDK_CONFIG_URING_PATH 00:06:59.707 #undef SPDK_CONFIG_URING_ZNS 00:06:59.707 #undef SPDK_CONFIG_USDT 00:06:59.707 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:06:59.707 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:06:59.707 #define SPDK_CONFIG_VFIO_USER 1 00:06:59.707 #define SPDK_CONFIG_VFIO_USER_DIR 00:06:59.707 #define SPDK_CONFIG_VHOST 1 00:06:59.707 #define SPDK_CONFIG_VIRTIO 1 00:06:59.707 #undef SPDK_CONFIG_VTUNE 00:06:59.707 #define SPDK_CONFIG_VTUNE_DIR 00:06:59.707 #define SPDK_CONFIG_WERROR 1 00:06:59.707 #define SPDK_CONFIG_WPDK_DIR 00:06:59.707 #undef SPDK_CONFIG_XNVME 00:06:59.707 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:06:59.707 00:14:58 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:06:59.707 00:14:58 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:59.707 00:14:58 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:59.707 00:14:58 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:59.707 00:14:58 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:59.707 00:14:58 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:59.707 00:14:58 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:59.707 00:14:58 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:59.707 00:14:58 -- paths/export.sh@5 -- # export PATH 00:06:59.707 00:14:58 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:59.707 00:14:58 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:59.707 00:14:58 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:59.707 00:14:58 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:59.707 00:14:58 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:59.707 00:14:58 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:06:59.707 00:14:58 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:59.707 00:14:58 -- pm/common@16 -- # TEST_TAG=N/A 00:06:59.707 00:14:58 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:06:59.707 00:14:58 -- common/autotest_common.sh@52 -- # : 1 00:06:59.707 00:14:58 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:06:59.707 00:14:58 -- common/autotest_common.sh@56 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:06:59.707 00:14:58 -- common/autotest_common.sh@58 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:06:59.707 00:14:58 -- common/autotest_common.sh@60 -- # : 1 00:06:59.707 00:14:58 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:06:59.707 00:14:58 -- common/autotest_common.sh@62 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:06:59.707 00:14:58 -- common/autotest_common.sh@64 -- # : 00:06:59.707 00:14:58 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:06:59.707 00:14:58 -- common/autotest_common.sh@66 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:06:59.707 00:14:58 -- common/autotest_common.sh@68 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:06:59.707 00:14:58 -- common/autotest_common.sh@70 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:06:59.707 00:14:58 -- common/autotest_common.sh@72 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:06:59.707 00:14:58 -- common/autotest_common.sh@74 -- # : 0 00:06:59.707 
00:14:58 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:06:59.707 00:14:58 -- common/autotest_common.sh@76 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:06:59.707 00:14:58 -- common/autotest_common.sh@78 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:06:59.707 00:14:58 -- common/autotest_common.sh@80 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:06:59.707 00:14:58 -- common/autotest_common.sh@82 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:06:59.707 00:14:58 -- common/autotest_common.sh@84 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:06:59.707 00:14:58 -- common/autotest_common.sh@86 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:06:59.707 00:14:58 -- common/autotest_common.sh@88 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:06:59.707 00:14:58 -- common/autotest_common.sh@90 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:06:59.707 00:14:58 -- common/autotest_common.sh@92 -- # : 1 00:06:59.707 00:14:58 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:06:59.707 00:14:58 -- common/autotest_common.sh@94 -- # : 1 00:06:59.707 00:14:58 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:06:59.707 00:14:58 -- common/autotest_common.sh@96 -- # : rdma 00:06:59.707 00:14:58 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:06:59.707 00:14:58 -- common/autotest_common.sh@98 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:06:59.707 00:14:58 -- common/autotest_common.sh@100 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:06:59.707 00:14:58 -- common/autotest_common.sh@102 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:06:59.707 00:14:58 -- common/autotest_common.sh@104 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:06:59.707 00:14:58 -- common/autotest_common.sh@106 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:06:59.707 00:14:58 -- common/autotest_common.sh@108 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:06:59.707 00:14:58 -- common/autotest_common.sh@110 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:06:59.707 00:14:58 -- common/autotest_common.sh@112 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:06:59.707 00:14:58 -- common/autotest_common.sh@114 -- # : 0 00:06:59.707 00:14:58 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:06:59.707 00:14:58 -- common/autotest_common.sh@116 -- # : 1 00:06:59.707 00:14:58 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:06:59.708 00:14:58 -- common/autotest_common.sh@118 -- # : 00:06:59.708 00:14:58 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:06:59.708 00:14:58 -- common/autotest_common.sh@120 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:06:59.708 00:14:58 -- common/autotest_common.sh@122 -- # : 0 
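
The long run of autotest_common.sh@52-@168 entries around this point is one shell idiom repeated per test knob: give the variable a default only if the CI config sourced earlier did not set it, then export it for child processes. A sketch; the `${VAR:=default}` mechanism is an assumption reconstructed from how each pair renders in the xtrace (`: 0` followed by `export VAR`), and the literals the trace shows are the post-expansion values for this run, not necessarily the defaults.

```bash
# ':' is the shell no-op, so each line exists purely for the
# ${VAR:=default} side effect; autorun-spdk.conf, sourced earlier, wins.
: "${RUN_NIGHTLY:=0}";                 export RUN_NIGHTLY            # 1 in this run, set by the conf
: "${SPDK_RUN_FUNCTIONAL_TEST:=0}";    export SPDK_RUN_FUNCTIONAL_TEST
: "${SPDK_TEST_FUZZER:=0}";            export SPDK_TEST_FUZZER       # 1 in this run
: "${SPDK_TEST_FUZZER_SHORT:=0}";      export SPDK_TEST_FUZZER_SHORT # 1 in this run
: "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"; export SPDK_TEST_NVMF_TRANSPORT
```
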
00:06:59.708 00:14:58 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:06:59.708 00:14:58 -- common/autotest_common.sh@124 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:06:59.708 00:14:58 -- common/autotest_common.sh@126 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:06:59.708 00:14:58 -- common/autotest_common.sh@128 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:06:59.708 00:14:58 -- common/autotest_common.sh@130 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:06:59.708 00:14:58 -- common/autotest_common.sh@132 -- # : 00:06:59.708 00:14:58 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:06:59.708 00:14:58 -- common/autotest_common.sh@134 -- # : true 00:06:59.708 00:14:58 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:06:59.708 00:14:58 -- common/autotest_common.sh@136 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:06:59.708 00:14:58 -- common/autotest_common.sh@138 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:06:59.708 00:14:58 -- common/autotest_common.sh@140 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:06:59.708 00:14:58 -- common/autotest_common.sh@142 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:06:59.708 00:14:58 -- common/autotest_common.sh@144 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:06:59.708 00:14:58 -- common/autotest_common.sh@146 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:06:59.708 00:14:58 -- common/autotest_common.sh@148 -- # : 00:06:59.708 00:14:58 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:06:59.708 00:14:58 -- common/autotest_common.sh@150 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:06:59.708 00:14:58 -- common/autotest_common.sh@152 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:06:59.708 00:14:58 -- common/autotest_common.sh@154 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:06:59.708 00:14:58 -- common/autotest_common.sh@156 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:06:59.708 00:14:58 -- common/autotest_common.sh@158 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:06:59.708 00:14:58 -- common/autotest_common.sh@160 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:06:59.708 00:14:58 -- common/autotest_common.sh@163 -- # : 00:06:59.708 00:14:58 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:06:59.708 00:14:58 -- common/autotest_common.sh@165 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:06:59.708 00:14:58 -- common/autotest_common.sh@167 -- # : 0 00:06:59.708 00:14:58 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:06:59.708 00:14:58 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:59.708 00:14:58 -- 
common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:59.708 00:14:58 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:06:59.708 00:14:58 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:06:59.708 00:14:58 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:59.708 00:14:58 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:59.708 00:14:58 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:06:59.708 00:14:58 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:06:59.708 00:14:58 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:59.708 00:14:58 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:59.708 00:14:58 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:59.708 00:14:58 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:59.708 00:14:58 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:06:59.708 00:14:58 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:06:59.708 00:14:58 -- common/autotest_common.sh@196 -- # cat 00:06:59.708 00:14:58 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:06:59.708 00:14:58 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:59.708 00:14:58 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:59.708 00:14:58 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:59.708 00:14:58 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:59.708 00:14:58 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:06:59.708 00:14:58 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:06:59.708 00:14:58 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:59.708 00:14:58 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:59.708 00:14:58 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:59.708 00:14:58 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:59.708 00:14:58 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:59.708 00:14:58 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:59.708 00:14:58 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:59.708 00:14:58 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:59.708 00:14:58 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:59.708 00:14:58 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:59.708 00:14:58 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:59.708 00:14:58 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:59.708 00:14:58 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:06:59.708 00:14:58 -- common/autotest_common.sh@249 -- # export valgrind= 00:06:59.708 00:14:58 -- common/autotest_common.sh@249 -- # valgrind= 00:06:59.708 00:14:58 -- common/autotest_common.sh@255 -- # uname -s 00:06:59.708 00:14:58 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:06:59.708 00:14:58 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:06:59.708 00:14:58 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:06:59.708 00:14:58 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:06:59.708 00:14:58 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:06:59.708 00:14:58 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:06:59.708 00:14:58 -- common/autotest_common.sh@265 -- # MAKE=make 00:06:59.708 00:14:58 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:06:59.708 00:14:58 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:06:59.708 00:14:58 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:06:59.708 00:14:58 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:06:59.708 00:14:58 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:06:59.708 00:14:58 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:06:59.708 00:14:58 -- common/autotest_common.sh@309 -- # [[ -z 325408 ]] 00:06:59.708 00:14:58 -- common/autotest_common.sh@309 -- # kill -0 325408 00:06:59.708 00:14:58 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:06:59.708 00:14:58 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:06:59.708 00:14:58 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:06:59.708 00:14:58 -- common/autotest_common.sh@322 -- # local mount target_dir 00:06:59.708 00:14:58 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:06:59.708 00:14:58 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:06:59.708 00:14:58 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:06:59.708 00:14:58 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:06:59.708 00:14:58 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.TM8Bzy 00:06:59.708 00:14:58 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:06:59.708 00:14:58 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:06:59.709 00:14:58 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:06:59.709 00:14:58 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.TM8Bzy/tests/nvmf /tmp/spdk.TM8Bzy 00:06:59.709 00:14:58 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@318 -- # df -T 00:06:59.709 00:14:58 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 
00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=54355607552 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=7386710016 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870425600 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=733184 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:06:59.709 00:14:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:06:59.709 00:14:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:06:59.709 00:14:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:59.709 00:14:58 -- common/autotest_common.sh@357 
-- # printf '* Looking for test storage...\n' 00:06:59.709 * Looking for test storage... 00:06:59.709 00:14:58 -- common/autotest_common.sh@359 -- # local target_space new_size 00:06:59.709 00:14:58 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:06:59.709 00:14:58 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:59.709 00:14:58 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:06:59.969 00:14:58 -- common/autotest_common.sh@363 -- # mount=/ 00:06:59.969 00:14:58 -- common/autotest_common.sh@365 -- # target_space=54355607552 00:06:59.969 00:14:58 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:06:59.969 00:14:58 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:06:59.969 00:14:58 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:06:59.969 00:14:58 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:06:59.969 00:14:58 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:06:59.969 00:14:58 -- common/autotest_common.sh@372 -- # new_size=9601302528 00:06:59.969 00:14:58 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:06:59.969 00:14:58 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:59.969 00:14:58 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:59.969 00:14:58 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:59.969 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:59.969 00:14:58 -- common/autotest_common.sh@380 -- # return 0 00:06:59.969 00:14:58 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:06:59.969 00:14:58 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:06:59.969 00:14:58 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:06:59.969 00:14:58 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:06:59.969 00:14:58 -- common/autotest_common.sh@1672 -- # true 00:06:59.969 00:14:58 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:06:59.969 00:14:58 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:06:59.969 00:14:58 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:06:59.969 00:14:58 -- common/autotest_common.sh@27 -- # exec 00:06:59.969 00:14:58 -- common/autotest_common.sh@29 -- # exec 00:06:59.969 00:14:58 -- common/autotest_common.sh@31 -- # xtrace_restore 00:06:59.969 00:14:58 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:06:59.969 00:14:58 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:06:59.969 00:14:58 -- common/autotest_common.sh@18 -- # set -x 00:06:59.969 00:14:58 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:06:59.969 00:14:58 -- ../common.sh@8 -- # pids=() 00:06:59.969 00:14:58 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:59.969 00:14:58 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:59.969 00:14:58 -- nvmf/run.sh@56 -- # fuzz_num=25 00:06:59.969 00:14:58 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:06:59.969 00:14:58 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:06:59.969 00:14:58 -- nvmf/run.sh@61 -- # mem_size=512 00:06:59.969 00:14:58 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:06:59.969 00:14:58 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:06:59.969 00:14:58 -- ../common.sh@69 -- # local fuzz_num=25 00:06:59.969 00:14:58 -- ../common.sh@70 -- # local time=1 00:06:59.969 00:14:58 -- ../common.sh@72 -- # (( i = 0 )) 00:06:59.969 00:14:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:59.969 00:14:58 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:06:59.969 00:14:58 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:06:59.969 00:14:58 -- nvmf/run.sh@24 -- # local timen=1 00:06:59.969 00:14:58 -- nvmf/run.sh@25 -- # local core=0x1 00:06:59.969 00:14:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:59.969 00:14:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:06:59.969 00:14:58 -- nvmf/run.sh@29 -- # printf %02d 0 00:06:59.969 00:14:58 -- nvmf/run.sh@29 -- # port=4400 00:06:59.969 00:14:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:59.969 00:14:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:06:59.969 00:14:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:59.969 00:14:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:06:59.969 [2024-07-15 00:14:58.824719] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
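
Immediately above, run.sh assembled and launched the harness whose output follows. A sketch of that plumbing, with the flags taken from the traced command line; the redirect of the sed output into /tmp/fuzz_json_0.conf is an assumption inferred from the later -c argument.

```bash
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
fuzzfile=$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c

# Count the .fn handler entries in the harness source to size the loop:
fuzz_num=$(grep -c '\.fn =' "$fuzzfile")    # 25 in this run
fuzzer_type=0                                # this run is fuzzer 0 of 25
port="44$(printf %02d "$fuzzer_type")"       # 4400, 4401, ... one port per index
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

# Rewrite the stock JSON config from the default 4420 to this fuzzer's port
# (output path assumed from the -c argument in the trace):
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"

"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m 0x1 -s 512 -P "$rootdir/../output/llvm/" \
    -F "$trid" -c "/tmp/fuzz_json_${fuzzer_type}.conf" -t 1 \
    -D "$rootdir/../corpus/llvm_nvmf_${fuzzer_type}" -Z "$fuzzer_type" \
    -r "/var/tmp/spdk${fuzzer_type}.sock"
```
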
00:06:59.969 [2024-07-15 00:14:58.824794] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid325574 ] 00:06:59.969 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.228 [2024-07-15 00:14:59.083006] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.228 [2024-07-15 00:14:59.171395] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:00.228 [2024-07-15 00:14:59.171555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.228 [2024-07-15 00:14:59.229346] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:00.228 [2024-07-15 00:14:59.245631] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:00.228 INFO: Running with entropic power schedule (0xFF, 100). 00:07:00.228 INFO: Seed: 1082281004 00:07:00.228 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:00.228 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:00.228 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:00.228 INFO: A corpus is not provided, starting from an empty corpus 00:07:00.228 #2 INITED exec/s: 0 rss: 61Mb 00:07:00.228 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:00.228 This may also happen if the target rejected all inputs we tried so far 00:07:00.487 [2024-07-15 00:14:59.294835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:00.487 [2024-07-15 00:14:59.294867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.747 NEW_FUNC[1/667]: 0x480d10 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:00.747 NEW_FUNC[2/667]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:00.747 #7 NEW cov: 11436 ft: 11478 corp: 2/104b lim: 320 exec/s: 0 rss: 67Mb L: 103/103 MS: 5 CopyPart-ShuffleBytes-ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:00.747 [2024-07-15 00:14:59.605615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:00.747 [2024-07-15 00:14:59.605651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.747 NEW_FUNC[1/3]: 0x1957ca0 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:07:00.747 NEW_FUNC[2/3]: 0x19593f0 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:07:00.747 #8 NEW cov: 11590 ft: 11982 corp: 3/207b lim: 320 exec/s: 0 rss: 67Mb L: 103/103 MS: 1 CopyPart- 00:07:00.747 [2024-07-15 00:14:59.655673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:00.747 
[2024-07-15 00:14:59.655700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.747 #9 NEW cov: 11596 ft: 12251 corp: 4/297b lim: 320 exec/s: 0 rss: 67Mb L: 90/103 MS: 1 EraseBytes- 00:07:00.747 [2024-07-15 00:14:59.695725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:00.747 [2024-07-15 00:14:59.695750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.747 #10 NEW cov: 11681 ft: 12509 corp: 5/387b lim: 320 exec/s: 0 rss: 68Mb L: 90/103 MS: 1 ShuffleBytes- 00:07:00.747 [2024-07-15 00:14:59.735839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:00.747 [2024-07-15 00:14:59.735865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.747 #11 NEW cov: 11681 ft: 12675 corp: 6/490b lim: 320 exec/s: 0 rss: 68Mb L: 103/103 MS: 1 ChangeBit- 00:07:00.747 [2024-07-15 00:14:59.775999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:00.747 [2024-07-15 00:14:59.776023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.747 #12 NEW cov: 11681 ft: 12783 corp: 7/593b lim: 320 exec/s: 0 rss: 68Mb L: 103/103 MS: 1 CMP- DE: "\000*\233\257k\356\025b"- 00:07:01.006 [2024-07-15 00:14:59.816098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.006 [2024-07-15 00:14:59.816123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.006 #13 NEW cov: 11681 ft: 12860 corp: 8/659b lim: 320 exec/s: 0 rss: 68Mb L: 66/103 MS: 1 EraseBytes- 00:07:01.006 [2024-07-15 00:14:59.846196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.006 [2024-07-15 00:14:59.846220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.006 #14 NEW cov: 11681 ft: 12878 corp: 9/725b lim: 320 exec/s: 0 rss: 68Mb L: 66/103 MS: 1 ShuffleBytes- 00:07:01.006 [2024-07-15 00:14:59.886310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff 00:07:01.006 [2024-07-15 00:14:59.886335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.006 #18 NEW cov: 11683 ft: 12958 corp: 10/838b lim: 320 exec/s: 0 rss: 68Mb L: 113/113 MS: 4 CopyPart-PersAutoDict-CrossOver-CrossOver- DE: "\000*\233\257k\356\025b"- 00:07:01.006 [2024-07-15 00:14:59.926598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 
0xffffffffffffffff 00:07:01.006 [2024-07-15 00:14:59.926624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.006 [2024-07-15 00:14:59.926702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.006 [2024-07-15 00:14:59.926717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.006 [2024-07-15 00:14:59.926778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6baf9b2a00ffffff 00:07:01.006 [2024-07-15 00:14:59.926791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.006 NEW_FUNC[1/1]: 0x12e0670 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:01.006 #19 NEW cov: 11714 ft: 13268 corp: 11/1044b lim: 320 exec/s: 0 rss: 68Mb L: 206/206 MS: 1 CrossOver- 00:07:01.006 [2024-07-15 00:14:59.976575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:af9b2a00 cdw11:6215ee6b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.006 [2024-07-15 00:14:59.976600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.006 #20 NEW cov: 11714 ft: 13310 corp: 12/1147b lim: 320 exec/s: 0 rss: 68Mb L: 103/206 MS: 1 PersAutoDict- DE: "\000*\233\257k\356\025b"- 00:07:01.006 [2024-07-15 00:15:00.006814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff3affffff 00:07:01.006 [2024-07-15 00:15:00.006840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.006 [2024-07-15 00:15:00.006900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.006 [2024-07-15 00:15:00.006915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.006 #21 NEW cov: 11714 ft: 13532 corp: 13/1292b lim: 320 exec/s: 0 rss: 68Mb L: 145/206 MS: 1 CrossOver- 00:07:01.006 [2024-07-15 00:15:00.046832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffff62 00:07:01.006 [2024-07-15 00:15:00.046860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.265 #22 NEW cov: 11714 ft: 13551 corp: 14/1395b lim: 320 exec/s: 0 rss: 68Mb L: 103/206 MS: 1 PersAutoDict- DE: "\000*\233\257k\356\025b"- 00:07:01.265 [2024-07-15 00:15:00.086919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff 00:07:01.265 [2024-07-15 00:15:00.086947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
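
For reference while the coverage log streams: the scratch-space pick traced earlier (ending at "* Found test storage .../nvmf") is the set_test_storage pattern, which parses `df -T` into associative arrays keyed by mount point and takes the first candidate directory whose filesystem survives a 95%-full guard. A reconstruction under those assumptions; the tmpfs/ramfs special cases visible in the trace are omitted, and candidates are passed in here where the real helper builds them from testdir plus mktemp fallbacks (/tmp/spdk.TM8Bzy in this run).

```bash
set_test_storage() {
    local requested_size=$1; shift
    local candidates=("$@") target_dir mount target_space new_size
    local -A mounts fss sizes avails uses

    # Column order follows the read in the trace: df -T prints
    # Filesystem Type Size Used Available Use% Mounted-on.
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source fss["$mount"]=$fs
        sizes["$mount"]=$size uses["$mount"]=$use avails["$mount"]=$avail
    done < <(df -T | grep -v Filesystem)

    for target_dir in "${candidates[@]}"; do
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]:-0}
        (( target_space == 0 || target_space < requested_size )) && continue
        # Refuse a filesystem the test data would push past 95% full;
        # 7386710016 used + 2214592512 requested = 9601302528 in this run.
        new_size=$(( uses[$mount] + requested_size ))
        (( new_size * 100 / sizes[$mount] > 95 )) && continue
        export SPDK_TEST_STORAGE=$target_dir
        printf '* Found test storage at %s\n' "$target_dir"
        return 0
    done
    return 1
}
```
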
00:07:01.265 #23 NEW cov: 11714 ft: 13561 corp: 15/1508b lim: 320 exec/s: 0 rss: 69Mb L: 113/206 MS: 1 ChangeByte- 00:07:01.265 [2024-07-15 00:15:00.127224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.265 [2024-07-15 00:15:00.127250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.265 [2024-07-15 00:15:00.127325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.265 [2024-07-15 00:15:00.127339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.265 [2024-07-15 00:15:00.127397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6baf9b2a00ffffff 00:07:01.265 [2024-07-15 00:15:00.127410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.265 #24 NEW cov: 11714 ft: 13615 corp: 16/1714b lim: 320 exec/s: 0 rss: 69Mb L: 206/206 MS: 1 ChangeByte- 00:07:01.265 [2024-07-15 00:15:00.167130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.265 [2024-07-15 00:15:00.167156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.265 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:01.265 #25 NEW cov: 11737 ft: 13655 corp: 17/1780b lim: 320 exec/s: 0 rss: 69Mb L: 66/206 MS: 1 ChangeBit- 00:07:01.265 [2024-07-15 00:15:00.207232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.265 [2024-07-15 00:15:00.207257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.265 #26 NEW cov: 11737 ft: 13707 corp: 18/1870b lim: 320 exec/s: 0 rss: 69Mb L: 90/206 MS: 1 CopyPart- 00:07:01.265 [2024-07-15 00:15:00.247414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.265 [2024-07-15 00:15:00.247439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.265 #27 NEW cov: 11737 ft: 13729 corp: 19/1937b lim: 320 exec/s: 0 rss: 69Mb L: 67/206 MS: 1 InsertByte- 00:07:01.266 [2024-07-15 00:15:00.287537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:af9b2a00 cdw11:decfedb7 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.266 [2024-07-15 00:15:00.287562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.266 #28 NEW cov: 11737 ft: 13746 corp: 20/2040b lim: 320 exec/s: 28 rss: 69Mb L: 
103/206 MS: 1 CMP- DE: "\000*\233\257\267\355\317\336"- 00:07:01.266 [2024-07-15 00:15:00.317603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.266 [2024-07-15 00:15:00.317628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 #29 NEW cov: 11737 ft: 13840 corp: 21/2106b lim: 320 exec/s: 29 rss: 69Mb L: 66/206 MS: 1 PersAutoDict- DE: "\000*\233\257k\356\025b"- 00:07:01.525 [2024-07-15 00:15:00.347828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.347857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 [2024-07-15 00:15:00.347909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:01.525 [2024-07-15 00:15:00.347922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.525 #30 NEW cov: 11737 ft: 13868 corp: 22/2261b lim: 320 exec/s: 30 rss: 69Mb L: 155/206 MS: 1 InsertRepeatedBytes- 00:07:01.525 [2024-07-15 00:15:00.377992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.378017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 [2024-07-15 00:15:00.378080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.378094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.525 [2024-07-15 00:15:00.378156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xaf9b2a00ffffffff 00:07:01.525 [2024-07-15 00:15:00.378170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.525 #31 NEW cov: 11737 ft: 13920 corp: 23/2468b lim: 320 exec/s: 31 rss: 69Mb L: 207/207 MS: 1 InsertByte- 00:07:01.525 [2024-07-15 00:15:00.418082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.418107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 [2024-07-15 00:15:00.418179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.418193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:01.525 [2024-07-15 00:15:00.418254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xaf9b2a00ffffffff 00:07:01.525 [2024-07-15 00:15:00.418269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.525 #32 NEW cov: 11737 ft: 13937 corp: 24/2675b lim: 320 exec/s: 32 rss: 70Mb L: 207/207 MS: 1 ChangeBinInt- 00:07:01.525 [2024-07-15 00:15:00.458121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.458145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 [2024-07-15 00:15:00.458203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.458217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.525 #33 NEW cov: 11737 ft: 13959 corp: 25/2807b lim: 320 exec/s: 33 rss: 70Mb L: 132/207 MS: 1 CrossOver- 00:07:01.525 [2024-07-15 00:15:00.498082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff2affffffffff 00:07:01.525 [2024-07-15 00:15:00.498111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 #34 NEW cov: 11737 ft: 13968 corp: 26/2873b lim: 320 exec/s: 34 rss: 70Mb L: 66/207 MS: 1 ChangeByte- 00:07:01.525 [2024-07-15 00:15:00.538174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.538199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.525 #35 NEW cov: 11737 ft: 13983 corp: 27/2960b lim: 320 exec/s: 35 rss: 70Mb L: 87/207 MS: 1 EraseBytes- 00:07:01.525 [2024-07-15 00:15:00.568286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.525 [2024-07-15 00:15:00.568310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 #36 NEW cov: 11737 ft: 13990 corp: 28/3050b lim: 320 exec/s: 36 rss: 70Mb L: 90/207 MS: 1 CopyPart- 00:07:01.784 [2024-07-15 00:15:00.598472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.598497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 [2024-07-15 00:15:00.598545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 
0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.598558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.784 #42 NEW cov: 11737 ft: 13996 corp: 29/3182b lim: 320 exec/s: 42 rss: 70Mb L: 132/207 MS: 1 ChangeBit- 00:07:01.784 [2024-07-15 00:15:00.638489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xff2affffffffffff 00:07:01.784 [2024-07-15 00:15:00.638513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 #43 NEW cov: 11737 ft: 14023 corp: 30/3249b lim: 320 exec/s: 43 rss: 70Mb L: 67/207 MS: 1 InsertByte- 00:07:01.784 [2024-07-15 00:15:00.678622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.678647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 #44 NEW cov: 11737 ft: 14065 corp: 31/3336b lim: 320 exec/s: 44 rss: 70Mb L: 87/207 MS: 1 ChangeBit- 00:07:01.784 [2024-07-15 00:15:00.718724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:af9b2a00 cdw11:6215ee6b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.718748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 #45 NEW cov: 11737 ft: 14103 corp: 32/3439b lim: 320 exec/s: 45 rss: 70Mb L: 103/207 MS: 1 CopyPart- 00:07:01.784 [2024-07-15 00:15:00.758839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.758862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 #46 NEW cov: 11737 ft: 14112 corp: 33/3529b lim: 320 exec/s: 46 rss: 70Mb L: 90/207 MS: 1 ChangeBit- 00:07:01.784 [2024-07-15 00:15:00.788925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.788955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.784 #47 NEW cov: 11737 ft: 14125 corp: 34/3616b lim: 320 exec/s: 47 rss: 70Mb L: 87/207 MS: 1 ShuffleBytes- 00:07:01.784 [2024-07-15 00:15:00.829054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:01.784 [2024-07-15 00:15:00.829078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.043 #48 NEW cov: 11737 ft: 14144 corp: 35/3706b lim: 320 exec/s: 48 rss: 70Mb L: 90/207 MS: 1 ChangeByte- 00:07:02.043 [2024-07-15 00:15:00.869169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff 00:07:02.043 
[2024-07-15 00:15:00.869193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.043 #49 NEW cov: 11737 ft: 14164 corp: 36/3819b lim: 320 exec/s: 49 rss: 70Mb L: 113/207 MS: 1 ChangeByte- 00:07:02.043 [2024-07-15 00:15:00.909313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:00.909338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.044 #50 NEW cov: 11737 ft: 14166 corp: 37/3906b lim: 320 exec/s: 50 rss: 70Mb L: 87/207 MS: 1 CrossOver- 00:07:02.044 [2024-07-15 00:15:00.939356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xff2affffffffffff 00:07:02.044 [2024-07-15 00:15:00.939380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.044 #51 NEW cov: 11737 ft: 14182 corp: 38/3973b lim: 320 exec/s: 51 rss: 70Mb L: 67/207 MS: 1 ShuffleBytes- 00:07:02.044 [2024-07-15 00:15:00.979795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:00.979819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.044 [2024-07-15 00:15:00.979877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:00.979890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.044 [2024-07-15 00:15:00.979948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:00.979961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.044 [2024-07-15 00:15:00.980017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:7 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:00.980030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.044 #57 NEW cov: 11737 ft: 14403 corp: 39/4230b lim: 320 exec/s: 57 rss: 70Mb L: 257/257 MS: 1 CopyPart- 00:07:02.044 [2024-07-15 00:15:01.019641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:fffffffe cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:01.019666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.044 #58 NEW cov: 11737 ft: 14413 corp: 40/4320b lim: 320 exec/s: 58 rss: 70Mb L: 90/257 MS: 1 ChangeBit- 00:07:02.044 [2024-07-15 00:15:01.049719] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.044 [2024-07-15 00:15:01.049744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.044 #59 NEW cov: 11737 ft: 14436 corp: 41/4410b lim: 320 exec/s: 59 rss: 70Mb L: 90/257 MS: 1 ChangeByte- 00:07:02.044 [2024-07-15 00:15:01.079743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff 00:07:02.044 [2024-07-15 00:15:01.079768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.303 #60 NEW cov: 11737 ft: 14445 corp: 42/4523b lim: 320 exec/s: 60 rss: 70Mb L: 113/257 MS: 1 CMP- DE: "\377\377\377\033"- 00:07:02.303 [2024-07-15 00:15:01.119925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff2affffffffff 00:07:02.303 [2024-07-15 00:15:01.119949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.303 #61 NEW cov: 11737 ft: 14458 corp: 43/4610b lim: 320 exec/s: 61 rss: 70Mb L: 87/257 MS: 1 ChangeByte- 00:07:02.303 [2024-07-15 00:15:01.160228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffff3aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.303 [2024-07-15 00:15:01.160253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.303 [2024-07-15 00:15:01.160329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.303 [2024-07-15 00:15:01.160343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.303 [2024-07-15 00:15:01.160404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:9b2a00ff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.303 [2024-07-15 00:15:01.160417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.303 #62 NEW cov: 11737 ft: 14461 corp: 44/4817b lim: 320 exec/s: 62 rss: 70Mb L: 207/257 MS: 1 CrossOver- 00:07:02.303 [2024-07-15 00:15:01.200148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:af9b2a00 cdw11:6215ee6b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.303 [2024-07-15 00:15:01.200172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.303 #63 NEW cov: 11737 ft: 14471 corp: 45/4920b lim: 320 exec/s: 63 rss: 71Mb L: 103/257 MS: 1 ChangeBinInt- 00:07:02.303 [2024-07-15 00:15:01.240255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:af9b2a00 cdw11:6215ee6b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:02.303 [2024-07-15 00:15:01.240279] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:02.303 #64 NEW cov: 11737 ft: 14479 corp: 46/5023b lim: 320 exec/s: 64 rss: 71Mb L: 103/257 MS: 1 ShuffleBytes-
00:07:02.303 [2024-07-15 00:15:01.270427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:02.303 [2024-07-15 00:15:01.270455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:02.303 [2024-07-15 00:15:01.270521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:02.303 [2024-07-15 00:15:01.270534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:02.303 #65 NEW cov: 11737 ft: 14494 corp: 47/5155b lim: 320 exec/s: 32 rss: 71Mb L: 132/257 MS: 1 ChangeBit-
00:07:02.303 #65 DONE cov: 11737 ft: 14494 corp: 47/5155b lim: 320 exec/s: 32 rss: 71Mb
00:07:02.303 ###### Recommended dictionary. ######
00:07:02.303 "\000*\233\257k\356\025b" # Uses: 4
00:07:02.303 "\000*\233\257\267\355\317\336" # Uses: 0
00:07:02.303 "\377\377\377\033" # Uses: 0
00:07:02.303 ###### End of recommended dictionary. ######
00:07:02.303 Done 65 runs in 2 second(s)
00:07:02.562 00:15:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf
00:07:02.562 00:15:01 -- ../common.sh@72 -- # (( i++ ))
00:07:02.562 00:15:01 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:02.562 00:15:01 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:07:02.562 00:15:01 -- nvmf/run.sh@23 -- # local fuzzer_type=1
00:07:02.562 00:15:01 -- nvmf/run.sh@24 -- # local timen=1
00:07:02.562 00:15:01 -- nvmf/run.sh@25 -- # local core=0x1
00:07:02.562 00:15:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:02.562 00:15:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf
00:07:02.562 00:15:01 -- nvmf/run.sh@29 -- # printf %02d 1
00:07:02.562 00:15:01 -- nvmf/run.sh@29 -- # port=4401
00:07:02.562 00:15:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:02.562 00:15:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401'
00:07:02.562 00:15:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:02.562 00:15:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock
00:07:02.562 [2024-07-15 00:15:01.455046] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:02.562 [2024-07-15 00:15:01.455120] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid326012 ]
00:07:02.821 EAL: No free 2048 kB hugepages reported on node 1
00:07:02.821 [2024-07-15 00:15:01.706628] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:02.821 [2024-07-15 00:15:01.783270] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:02.821 [2024-07-15 00:15:01.783406] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:02.821 [2024-07-15 00:15:01.842234] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:02.821 [2024-07-15 00:15:01.858527] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 ***
00:07:02.821 INFO: Running with entropic power schedule (0xFF, 100).
00:07:02.821 INFO: Seed: 3697301066
00:07:03.079 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:03.079 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:03.079 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:03.079 INFO: A corpus is not provided, starting from an empty corpus
00:07:03.079 #2 INITED exec/s: 0 rss: 61Mb
00:07:03.079 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:03.079 This may also happen if the target rejected all inputs we tried so far
00:07:03.079 [2024-07-15 00:15:01.925040] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:03.079 [2024-07-15 00:15:01.926211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:03.080 [2024-07-15 00:15:01.926256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:03.080 [2024-07-15 00:15:01.926331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:03.080 [2024-07-15 00:15:01.926347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:03.080 [2024-07-15 00:15:01.926418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:03.080 [2024-07-15 00:15:01.926434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:03.080 [2024-07-15 00:15:01.926510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:03.080 [2024-07-15 00:15:01.926525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:03.339 NEW_FUNC[1/670]: 0x481610 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67
00:07:03.339 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:03.339 #3 NEW cov: 11584 ft: 11585 corp: 2/30b lim: 30 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:03.339 [2024-07-15 00:15:02.265386] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:03.339 [2024-07-15 00:15:02.265842] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (29700) > buf size (4096) 00:07:03.339 [2024-07-15 00:15:02.266217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.266258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.266373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.266390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.266513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.266529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.266661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:1d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.266678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.339 NEW_FUNC[1/1]: 0x16c4af0 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1090 00:07:03.339 #9 NEW cov: 11700 ft: 12275 corp: 3/59b lim: 30 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeBinInt- 00:07:03.339 [2024-07-15 00:15:02.315409] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.339 [2024-07-15 00:15:02.316158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.316188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.316297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.316316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.316454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.316472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.316601] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.316618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.339 #10 NEW cov: 11706 ft: 12522 corp: 4/88b lim: 30 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CrossOver- 00:07:03.339 [2024-07-15 00:15:02.355512] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (152580) > buf size (4096) 00:07:03.339 [2024-07-15 00:15:02.356140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:95000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.356168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.356286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.356305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.339 [2024-07-15 00:15:02.356423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.339 [2024-07-15 00:15:02.356455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.339 #14 NEW cov: 11791 ft: 13221 corp: 5/109b lim: 30 exec/s: 0 rss: 67Mb L: 21/29 MS: 4 ChangeBit-ChangeBit-ChangeByte-CrossOver- 00:07:03.599 [2024-07-15 00:15:02.395616] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:03.599 [2024-07-15 00:15:02.396258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.396287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.396410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.396439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.396562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.396580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.599 #15 NEW cov: 11791 ft: 13435 corp: 6/131b lim: 30 exec/s: 0 rss: 67Mb L: 22/29 MS: 1 InsertRepeatedBytes- 00:07:03.599 [2024-07-15 00:15:02.435880] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.599 [2024-07-15 00:15:02.436035] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (15616) > len (4) 00:07:03.599 [2024-07-15 00:15:02.436775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.436804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.436927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.436944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.437074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.437094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.437223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.437240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.437356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.437376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.599 #16 NEW cov: 11797 ft: 13600 corp: 7/161b lim: 30 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertByte- 00:07:03.599 [2024-07-15 00:15:02.485847] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005151 00:07:03.599 [2024-07-15 00:15:02.486025] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005151 00:07:03.599 [2024-07-15 00:15:02.486335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1f518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.486363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.486467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:51518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.486485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.599 #18 NEW cov: 11803 ft: 13933 corp: 8/178b lim: 30 exec/s: 0 rss: 68Mb L: 17/30 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:03.599 [2024-07-15 00:15:02.526186] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.599 [2024-07-15 00:15:02.526771] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x25 00:07:03.599 [2024-07-15 00:15:02.527121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.527150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.527238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.527255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.527384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.527401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.527545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.527563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.527684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.527701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.599 #19 NEW cov: 11803 ft: 13966 corp: 9/208b lim: 30 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertByte- 00:07:03.599 [2024-07-15 00:15:02.566217] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.599 [2024-07-15 00:15:02.566833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.566862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.566983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.599 [2024-07-15 00:15:02.567000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.599 [2024-07-15 00:15:02.567125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.600 [2024-07-15 00:15:02.567143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.600 #20 NEW cov: 11803 ft: 13995 corp: 10/227b lim: 30 exec/s: 0 rss: 68Mb L: 19/30 MS: 1 EraseBytes- 00:07:03.600 [2024-07-15 00:15:02.616311] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (14340) > buf size (4096) 00:07:03.600 [2024-07-15 00:15:02.616771] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (29700) > buf size (4096) 00:07:03.600 [2024-07-15 00:15:02.617110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.600 [2024-07-15 00:15:02.617136] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.600 [2024-07-15 00:15:02.617251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.600 [2024-07-15 00:15:02.617269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.600 [2024-07-15 00:15:02.617377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.600 [2024-07-15 00:15:02.617395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.600 [2024-07-15 00:15:02.617517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:1d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.600 [2024-07-15 00:15:02.617536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.600 #21 NEW cov: 11803 ft: 14061 corp: 11/256b lim: 30 exec/s: 0 rss: 68Mb L: 29/30 MS: 1 ChangeBit- 00:07:03.860 [2024-07-15 00:15:02.656129] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.656594] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.656758] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x25 00:07:03.860 [2024-07-15 00:15:02.657114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.657140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.657278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.657294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.657404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.657421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.657548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.657565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.657685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.657703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 
sqhd:0013 p:0 m:0 dnr:0 00:07:03.860 #22 NEW cov: 11803 ft: 14087 corp: 12/286b lim: 30 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:03.860 [2024-07-15 00:15:02.706168] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.706820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.706850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.706975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.706990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.707111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.707128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.860 #23 NEW cov: 11803 ft: 14139 corp: 13/305b lim: 30 exec/s: 0 rss: 68Mb L: 19/30 MS: 1 ChangeBit- 00:07:03.860 [2024-07-15 00:15:02.746853] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.747448] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x25 00:07:03.860 [2024-07-15 00:15:02.747783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.747810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.747931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.747948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.748069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.748084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.748221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.748241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.748357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.748376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.860 #24 NEW cov: 11803 ft: 14211 corp: 14/335b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:03.860 [2024-07-15 00:15:02.786977] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.787915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.787943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.788066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.788083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.788200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.788219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.788342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.788358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.788479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.788495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.860 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:03.860 #25 NEW cov: 11826 ft: 14286 corp: 15/365b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 CopyPart- 00:07:03.860 [2024-07-15 00:15:02.836870] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (14340) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.837532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.837561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.837695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.837712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.837831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.837849] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.860 #26 NEW cov: 11826 ft: 14313 corp: 16/385b lim: 30 exec/s: 0 rss: 69Mb L: 20/30 MS: 1 EraseBytes- 00:07:03.860 [2024-07-15 00:15:02.887206] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:03.860 [2024-07-15 00:15:02.888125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.888154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.888288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.888306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.888423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.888445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.888569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.888586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.860 [2024-07-15 00:15:02.888700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.860 [2024-07-15 00:15:02.888716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.120 #27 NEW cov: 11826 ft: 14336 corp: 17/415b lim: 30 exec/s: 27 rss: 69Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:04.120 [2024-07-15 00:15:02.937408] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.120 [2024-07-15 00:15:02.938307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.938335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.938447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.938465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.938591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.938610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.938724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.938740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.938856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.938876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.120 #28 NEW cov: 11826 ft: 14374 corp: 18/445b lim: 30 exec/s: 28 rss: 69Mb L: 30/30 MS: 1 CopyPart- 00:07:04.120 [2024-07-15 00:15:02.977259] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:04.120 [2024-07-15 00:15:02.978009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.978041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.978152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.978170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.978287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.978306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:02.978418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:02.978437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.120 #29 NEW cov: 11826 ft: 14391 corp: 19/474b lim: 30 exec/s: 29 rss: 69Mb L: 29/30 MS: 1 ChangeBinInt- 00:07:04.120 [2024-07-15 00:15:03.017460] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.120 [2024-07-15 00:15:03.018241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.018269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.018388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.018406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.018523] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.018541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.018673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.018691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.120 #30 NEW cov: 11826 ft: 14422 corp: 20/501b lim: 30 exec/s: 30 rss: 69Mb L: 27/30 MS: 1 EraseBytes- 00:07:04.120 [2024-07-15 00:15:03.057755] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (14340) > buf size (4096) 00:07:04.120 [2024-07-15 00:15:03.058190] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (29700) > buf size (4096) 00:07:04.120 [2024-07-15 00:15:03.058556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.058585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.058701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.058717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.058825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.058846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.058964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:1d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.058983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.120 #31 NEW cov: 11826 ft: 14425 corp: 21/530b lim: 30 exec/s: 31 rss: 69Mb L: 29/30 MS: 1 ChangeBinInt- 00:07:04.120 [2024-07-15 00:15:03.097875] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.120 [2024-07-15 00:15:03.098028] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (14592) > len (4) 00:07:04.120 [2024-07-15 00:15:03.098788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.098818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.098930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.098950] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.099070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.099090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.120 [2024-07-15 00:15:03.099208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.120 [2024-07-15 00:15:03.099226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.121 [2024-07-15 00:15:03.099349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.121 [2024-07-15 00:15:03.099368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.121 #32 NEW cov: 11826 ft: 14470 corp: 22/560b lim: 30 exec/s: 32 rss: 69Mb L: 30/30 MS: 1 ChangeBit- 00:07:04.121 [2024-07-15 00:15:03.137927] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6912) > len (4) 00:07:04.121 [2024-07-15 00:15:03.138683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.121 [2024-07-15 00:15:03.138712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.121 [2024-07-15 00:15:03.138823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.121 [2024-07-15 00:15:03.138840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.121 [2024-07-15 00:15:03.138960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.121 [2024-07-15 00:15:03.138976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.121 [2024-07-15 00:15:03.139096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.121 [2024-07-15 00:15:03.139115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.121 #33 NEW cov: 11826 ft: 14512 corp: 23/587b lim: 30 exec/s: 33 rss: 69Mb L: 27/30 MS: 1 ChangeBinInt- 00:07:04.380 [2024-07-15 00:15:03.188081] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.188541] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:04.380 [2024-07-15 00:15:03.188865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 
[2024-07-15 00:15:03.188894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.189008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.189026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.189148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.189169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.189279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.189296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.380 #34 NEW cov: 11826 ft: 14524 corp: 24/615b lim: 30 exec/s: 34 rss: 69Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:04.380 [2024-07-15 00:15:03.228335] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.228657] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (26624) > len (4) 00:07:04.380 [2024-07-15 00:15:03.228969] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x25 00:07:04.380 [2024-07-15 00:15:03.229334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.229362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.229475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.229493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.229613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.229630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.229752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.229768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.229894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.229913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.380 #35 NEW cov: 11826 ft: 14554 corp: 25/645b lim: 30 exec/s: 35 rss: 69Mb L: 30/30 MS: 1 ChangeByte- 00:07:04.380 [2024-07-15 00:15:03.267899] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6912) > len (4) 00:07:04.380 [2024-07-15 00:15:03.268660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.268687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.268803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.268820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.268937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.268952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.269062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.269080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.380 #36 NEW cov: 11826 ft: 14576 corp: 26/673b lim: 30 exec/s: 36 rss: 69Mb L: 28/30 MS: 1 CopyPart- 00:07:04.380 [2024-07-15 00:15:03.318052] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.318482] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:04.380 [2024-07-15 00:15:03.318851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.318878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.318995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.319012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.319135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.319154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.319273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.319291] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.380 #37 NEW cov: 11826 ft: 14643 corp: 27/701b lim: 30 exec/s: 37 rss: 69Mb L: 28/30 MS: 1 CMP- DE: "\003\000\000\000"- 00:07:04.380 [2024-07-15 00:15:03.368608] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.368920] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (4100) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.369399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.369428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.369551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.369570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.369687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:04000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.369704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.369820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.369837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.380 #38 NEW cov: 11826 ft: 14646 corp: 28/725b lim: 30 exec/s: 38 rss: 69Mb L: 24/30 MS: 1 CrossOver- 00:07:04.380 [2024-07-15 00:15:03.408790] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.409221] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (29700) > buf size (4096) 00:07:04.380 [2024-07-15 00:15:03.409560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.409590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.409713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.409732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.409857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.409875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.380 [2024-07-15 00:15:03.409996] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:1d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.380 [2024-07-15 00:15:03.410014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.380 #39 NEW cov: 11826 ft: 14654 corp: 29/754b lim: 30 exec/s: 39 rss: 69Mb L: 29/30 MS: 1 ChangeBit- 00:07:04.638 [2024-07-15 00:15:03.448767] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:04.638 [2024-07-15 00:15:03.448940] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.638 [2024-07-15 00:15:03.449433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.638 [2024-07-15 00:15:03.449468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.638 [2024-07-15 00:15:03.449592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.638 [2024-07-15 00:15:03.449612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.638 [2024-07-15 00:15:03.449729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.638 [2024-07-15 00:15:03.449748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.638 #40 NEW cov: 11826 ft: 14687 corp: 30/774b lim: 30 exec/s: 40 rss: 70Mb L: 20/30 MS: 1 EraseBytes- 00:07:04.638 [2024-07-15 00:15:03.489072] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000007e 00:07:04.638 [2024-07-15 00:15:03.489653] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x25 00:07:04.638 [2024-07-15 00:15:03.490008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.638 [2024-07-15 00:15:03.490035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.638 [2024-07-15 00:15:03.490151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.490168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.490293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.490311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.490435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.490456] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.490572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.490589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.639 #41 NEW cov: 11826 ft: 14703 corp: 31/804b lim: 30 exec/s: 41 rss: 70Mb L: 30/30 MS: 1 ChangeByte- 00:07:04.639 [2024-07-15 00:15:03.528969] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005115 00:07:04.639 [2024-07-15 00:15:03.529129] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005151 00:07:04.639 [2024-07-15 00:15:03.529283] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000510a 00:07:04.639 [2024-07-15 00:15:03.529627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1f518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.529654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.529771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:51518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.529787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.529906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:51518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.529923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.639 #42 NEW cov: 11826 ft: 14734 corp: 32/822b lim: 30 exec/s: 42 rss: 70Mb L: 18/30 MS: 1 InsertByte- 00:07:04.639 [2024-07-15 00:15:03.569210] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6912) > len (4) 00:07:04.639 [2024-07-15 00:15:03.570008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.570037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.570170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.570191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.570314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.570334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.570457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.570475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.609314] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6912) > len (4) 00:07:04.639 [2024-07-15 00:15:03.610093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.610121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.610238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:003d0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.610256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.610366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.610383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.610506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.610523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.639 #44 NEW cov: 11826 ft: 14743 corp: 33/850b lim: 30 exec/s: 44 rss: 70Mb L: 28/30 MS: 2 ChangeBit-ChangeByte- 00:07:04.639 [2024-07-15 00:15:03.649340] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (294216) > buf size (4096) 00:07:04.639 [2024-07-15 00:15:03.649504] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x51 00:07:04.639 [2024-07-15 00:15:03.649659] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (345416) > buf size (4096) 00:07:04.639 [2024-07-15 00:15:03.650013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1f518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.650039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.650158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:005d0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.650176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.650288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:51518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.650306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.639 #45 NEW cov: 11826 ft: 14764 corp: 34/868b lim: 30 exec/s: 
45 rss: 70Mb L: 18/30 MS: 1 CrossOver- 00:07:04.639 [2024-07-15 00:15:03.689347] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000007e 00:07:04.639 [2024-07-15 00:15:03.689855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.689887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.639 [2024-07-15 00:15:03.689997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.639 [2024-07-15 00:15:03.690013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.898 #46 NEW cov: 11826 ft: 14790 corp: 35/880b lim: 30 exec/s: 46 rss: 70Mb L: 12/30 MS: 1 CrossOver- 00:07:04.898 [2024-07-15 00:15:03.739829] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.898 [2024-07-15 00:15:03.740146] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1e 00:07:04.898 [2024-07-15 00:15:03.740786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.898 [2024-07-15 00:15:03.740814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.740935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.740952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.741069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.741088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.741204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.741220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.741339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.741357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.899 #47 NEW cov: 11826 ft: 14812 corp: 36/910b lim: 30 exec/s: 47 rss: 70Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:04.899 [2024-07-15 00:15:03.779944] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.899 [2024-07-15 00:15:03.780118] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (14592) > len (4) 00:07:04.899 [2024-07-15 00:15:03.780894] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.780923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.781035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.781052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.781173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.781190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.781315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.781330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.781453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.781470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.899 #48 NEW cov: 11826 ft: 14825 corp: 37/940b lim: 30 exec/s: 48 rss: 70Mb L: 30/30 MS: 1 CopyPart- 00:07:04.899 [2024-07-15 00:15:03.829822] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (294216) > buf size (4096) 00:07:04.899 [2024-07-15 00:15:03.829996] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2 00:07:04.899 [2024-07-15 00:15:03.830311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1f518151 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.830339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.830458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.830476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.899 #49 NEW cov: 11826 ft: 14857 corp: 38/957b lim: 30 exec/s: 49 rss: 70Mb L: 17/30 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\002"- 00:07:04.899 [2024-07-15 00:15:03.870171] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.899 [2024-07-15 00:15:03.870344] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:04.899 [2024-07-15 00:15:03.870502] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1e 00:07:04.899 [2024-07-15 00:15:03.871123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.871150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.871270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.871289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.871416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.871436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.871522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.871538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.871659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.871677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.899 #50 NEW cov: 11826 ft: 14869 corp: 39/987b lim: 30 exec/s: 50 rss: 70Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:04.899 [2024-07-15 00:15:03.920334] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:04.899 [2024-07-15 00:15:03.921298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.921327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.921449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.921468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.921596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.921616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.921739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.899 [2024-07-15 00:15:03.921758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.899 [2024-07-15 00:15:03.921870] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:04.899 [2024-07-15 00:15:03.921891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:04.899 #51 NEW cov: 11826 ft: 14879 corp: 40/1017b lim: 30 exec/s: 25 rss: 70Mb L: 30/30 MS: 1 ChangeBit-
00:07:04.899 #51 DONE cov: 11826 ft: 14879 corp: 40/1017b lim: 30 exec/s: 25 rss: 70Mb
00:07:04.899 ###### Recommended dictionary. ######
00:07:04.899 "\003\000\000\000" # Uses: 0
00:07:04.899 "\001\000\000\000\000\000\000\002" # Uses: 0
00:07:04.899 ###### End of recommended dictionary. ######
00:07:04.899 Done 51 runs in 2 second(s)
00:07:05.158 00:15:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf
00:07:05.158 00:15:04 -- ../common.sh@72 -- # (( i++ ))
00:07:05.158 00:15:04 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:05.158 00:15:04 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1
00:07:05.158 00:15:04 -- nvmf/run.sh@23 -- # local fuzzer_type=2
00:07:05.158 00:15:04 -- nvmf/run.sh@24 -- # local timen=1
00:07:05.158 00:15:04 -- nvmf/run.sh@25 -- # local core=0x1
00:07:05.158 00:15:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2
00:07:05.158 00:15:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf
00:07:05.158 00:15:04 -- nvmf/run.sh@29 -- # printf %02d 2
00:07:05.158 00:15:04 -- nvmf/run.sh@29 -- # port=4402
00:07:05.158 00:15:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2
00:07:05.158 00:15:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402'
00:07:05.158 00:15:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:05.158 00:15:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock
00:07:05.158 [2024-07-15 00:15:04.103967] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
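Before the second fuzzer's output continues below, it is worth noting what run 1 above was exercising: the recurring ctrlr.c "Get log page" errors are the NVMe-oF target rejecting fuzzer-generated Get Log Page commands whose offset or length fail basic bounds checks. The following is an illustrative standalone C sketch of those two checks, not SPDK's actual ctrlr.c code; the function name and IOV_BUF_SIZE constant are invented here, and the field layout follows the NVMe spec (NUMDL in cdw10 bits 31:16, NUMDU in cdw11 bits 15:0, the 64-bit offset in cdw12/cdw13, length = (NUMD + 1) * 4 bytes).

/* Illustrative sketch only, not SPDK's ctrlr.c. It mirrors the bounds checks
 * behind the "Invalid log page offset" and "len > buf size" errors above. */
#include <inttypes.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define IOV_BUF_SIZE 4096 /* the "buf size (4096)" reported in the log */

static bool
get_log_page_in_bounds(uint32_t cdw10, uint32_t cdw11, uint32_t cdw12,
                       uint32_t cdw13)
{
        /* NUMDL is cdw10[31:16], NUMDU is cdw11[15:0]; the requested length
         * is (NUMD + 1) dwords, i.e. (((NUMDU << 16) | NUMDL) + 1) * 4 bytes. */
        uint64_t numdl = (cdw10 >> 16) & 0xFFFFu;
        uint64_t numdu = cdw11 & 0xFFFFu;
        uint64_t len = (((numdu << 16) | numdl) + 1) * 4;
        /* LPOL/LPOU (cdw12/cdw13) form the 64-bit log page offset. */
        uint64_t offset = (uint64_t)cdw12 | ((uint64_t)cdw13 << 32);

        if (offset & 3) { /* offsets must be dword-aligned */
                fprintf(stderr, "Invalid log page offset 0x%" PRIx64 "\n",
                        offset);
                return false;
        }
        if (len > IOV_BUF_SIZE) { /* larger than the reply buffer */
                fprintf(stderr, "Get log page: len (%" PRIu64 ") > buf size (%d)\n",
                        len, IOV_BUF_SIZE);
                return false;
        }
        return true;
}

int
main(void)
{
        /* cdw10:00008100 cdw11:00000001, as in the rejected commands above,
         * decodes to len 262148: "len (262148) > buf size (4096)". */
        get_log_page_in_bounds(0x00008100, 0x00000001, 0x0, 0x0);
        /* An unaligned offset such as 0x30000ffff (these cdw12/cdw13 values
         * are assumed) trips "Invalid log page offset 0x30000ffff". */
        get_log_page_in_bounds(0x0, 0x0, 0x0000ffff, 0x3);
        return 0;
}

Consistent with this, every offset the log flags as invalid (0x25, 0x1e, 0x51, 0x30000ffff, 0x10000007e) is not a multiple of 4, and the flagged lengths all exceed the 4096-byte buffer, so run 1 spent most of its coverage on these early rejection paths rather than on deeper log-page handling.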
00:07:05.158 [2024-07-15 00:15:04.104045] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid326685 ] 00:07:05.158 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.417 [2024-07-15 00:15:04.283730] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.417 [2024-07-15 00:15:04.346074] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:05.417 [2024-07-15 00:15:04.346222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.417 [2024-07-15 00:15:04.404519] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:05.417 [2024-07-15 00:15:04.420807] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:05.417 INFO: Running with entropic power schedule (0xFF, 100). 00:07:05.417 INFO: Seed: 1964317527 00:07:05.417 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:05.417 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:05.417 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:05.417 INFO: A corpus is not provided, starting from an empty corpus 00:07:05.417 #2 INITED exec/s: 0 rss: 60Mb 00:07:05.417 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:05.417 This may also happen if the target rejected all inputs we tried so far 00:07:05.417 [2024-07-15 00:15:04.465465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0032 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.417 [2024-07-15 00:15:04.465503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.935 NEW_FUNC[1/667]: 0x484030 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:05.935 NEW_FUNC[2/667]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:05.935 #19 NEW cov: 11483 ft: 11484 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 2 InsertByte-CMP- DE: "\377)\233\262%f.\200"- 00:07:05.935 [2024-07-15 00:15:04.796249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.935 [2024-07-15 00:15:04.796293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.935 NEW_FUNC[1/3]: 0x1537870 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3790 00:07:05.935 NEW_FUNC[2/3]: 0x17046e0 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1507 00:07:05.935 #20 NEW cov: 11624 ft: 11986 corp: 3/24b lim: 35 exec/s: 0 rss: 68Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:07:05.935 [2024-07-15 00:15:04.856233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0032 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.935 [2024-07-15 00:15:04.856266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.935 #21 NEW cov: 11630 ft: 12278 corp: 4/34b lim: 35 exec/s: 0 rss: 68Mb L: 10/13 MS: 1 ChangeByte- 00:07:05.935 [2024-07-15 00:15:04.916456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.935 [2024-07-15 00:15:04.916488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.935 #22 NEW cov: 11715 ft: 12559 corp: 5/44b lim: 35 exec/s: 0 rss: 68Mb L: 10/13 MS: 1 ChangeBit- 00:07:05.936 [2024-07-15 00:15:04.986633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:9b002529 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.936 [2024-07-15 00:15:04.986665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.195 #23 NEW cov: 11715 ft: 12750 corp: 6/54b lim: 35 exec/s: 0 rss: 68Mb L: 10/13 MS: 1 ShuffleBytes- 00:07:06.195 [2024-07-15 00:15:05.046794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.195 [2024-07-15 00:15:05.046823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.195 #24 NEW cov: 11715 ft: 12815 corp: 7/64b lim: 35 exec/s: 0 rss: 68Mb L: 10/13 MS: 1 ShuffleBytes- 00:07:06.195 [2024-07-15 00:15:05.096936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff290032 cdw11:25009bb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.195 [2024-07-15 00:15:05.096966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.195 [2024-07-15 00:15:05.097013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:800a002e cdw11:9b00ff29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.195 [2024-07-15 00:15:05.097028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.195 #25 NEW cov: 11715 ft: 13221 corp: 8/82b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 PersAutoDict- DE: "\377)\233\262%f.\200"- 00:07:06.195 [2024-07-15 00:15:05.157034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.195 [2024-07-15 00:15:05.157062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.195 #26 NEW cov: 11715 ft: 13271 corp: 9/93b lim: 35 exec/s: 0 rss: 68Mb L: 11/18 MS: 1 InsertByte- 00:07:06.195 [2024-07-15 00:15:05.217201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:0a002572 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.195 [2024-07-15 00:15:05.217230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.455 #27 NEW cov: 11715 ft: 13338 corp: 10/102b lim: 35 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 CrossOver- 00:07:06.455 [2024-07-15 00:15:05.277539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:4 nsid:0 cdw10:2fff0032 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.277568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.455 [2024-07-15 00:15:05.277616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.277633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.455 [2024-07-15 00:15:05.277662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.277679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.455 [2024-07-15 00:15:05.277708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.277723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.455 #30 NEW cov: 11715 ft: 13941 corp: 11/130b lim: 35 exec/s: 0 rss: 69Mb L: 28/28 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:07:06.455 [2024-07-15 00:15:05.337540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.337571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.455 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:06.455 #31 NEW cov: 11738 ft: 14001 corp: 12/141b lim: 35 exec/s: 0 rss: 69Mb L: 11/28 MS: 1 CrossOver- 00:07:06.455 [2024-07-15 00:15:05.407716] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:06.455 [2024-07-15 00:15:05.407869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:0000005b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.407897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.455 [2024-07-15 00:15:05.407929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.407945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.455 #32 NEW cov: 11747 ft: 14102 corp: 13/155b lim: 35 exec/s: 32 rss: 69Mb L: 14/28 MS: 1 InsertByte- 00:07:06.455 [2024-07-15 00:15:05.477896] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:06.455 [2024-07-15 00:15:05.478032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0032 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.478056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.455 [2024-07-15 00:15:05.478087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:2e800066 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.455 [2024-07-15 00:15:05.478103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.456 [2024-07-15 00:15:05.478131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.456 [2024-07-15 00:15:05.478148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.715 #33 NEW cov: 11747 ft: 14287 corp: 14/176b lim: 35 exec/s: 33 rss: 69Mb L: 21/28 MS: 1 InsertRepeatedBytes- 00:07:06.715 [2024-07-15 00:15:05.528006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ab20072 cdw11:ff00259b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.528036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.715 #39 NEW cov: 11747 ft: 14375 corp: 15/186b lim: 35 exec/s: 39 rss: 69Mb L: 10/28 MS: 1 ShuffleBytes- 00:07:06.715 [2024-07-15 00:15:05.578315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2fff0032 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.578345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.715 [2024-07-15 00:15:05.578378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.578394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.715 [2024-07-15 00:15:05.578422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ff94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.578438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.715 [2024-07-15 00:15:05.578473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.578488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.715 #40 NEW cov: 11747 ft: 14426 corp: 16/214b lim: 35 exec/s: 40 rss: 69Mb L: 28/28 MS: 1 ChangeByte- 00:07:06.715 [2024-07-15 00:15:05.648379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7ab20072 cdw11:ff00259b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.648415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.715 #41 NEW cov: 11747 ft: 14506 corp: 17/224b lim: 35 exec/s: 41 rss: 69Mb L: 10/28 MS: 1 ChangeByte- 00:07:06.715 [2024-07-15 00:15:05.708526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY 
(06) qid:0 cid:4 nsid:0 cdw10:0a250072 cdw11:ff00fe7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.708556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.715 #42 NEW cov: 11747 ft: 14533 corp: 18/231b lim: 35 exec/s: 42 rss: 69Mb L: 7/28 MS: 1 EraseBytes- 00:07:06.715 [2024-07-15 00:15:05.768730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.768760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.715 [2024-07-15 00:15:05.768792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7a6600fe cdw11:290080ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.715 [2024-07-15 00:15:05.768808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.975 #43 NEW cov: 11747 ft: 14636 corp: 19/250b lim: 35 exec/s: 43 rss: 69Mb L: 19/28 MS: 1 PersAutoDict- DE: "\377)\233\262%f.\200"- 00:07:06.975 [2024-07-15 00:15:05.818850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff290032 cdw11:25009bb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.975 [2024-07-15 00:15:05.818879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.975 [2024-07-15 00:15:05.818927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:800a002e cdw11:9b00ff29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.975 [2024-07-15 00:15:05.818942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.975 [2024-07-15 00:15:05.818971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:66ff0025 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.975 [2024-07-15 00:15:05.818987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.975 #44 NEW cov: 11747 ft: 14673 corp: 20/277b lim: 35 exec/s: 44 rss: 69Mb L: 27/28 MS: 1 CopyPart- 00:07:06.975 [2024-07-15 00:15:05.878946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.975 [2024-07-15 00:15:05.878975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.975 #45 NEW cov: 11747 ft: 14683 corp: 21/290b lim: 35 exec/s: 45 rss: 69Mb L: 13/28 MS: 1 ChangeBit- 00:07:06.975 [2024-07-15 00:15:05.929085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:0a002572 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.975 [2024-07-15 00:15:05.929114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.975 #46 NEW cov: 11747 ft: 14694 corp: 22/299b lim: 35 exec/s: 46 rss: 70Mb L: 9/28 MS: 1 ChangeBinInt- 00:07:06.975 [2024-07-15 00:15:05.989226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:320a0027 cdw11:9b00ff29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.975 [2024-07-15 00:15:05.989255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.975 #47 NEW cov: 11747 ft: 14710 corp: 23/310b lim: 35 exec/s: 47 rss: 70Mb L: 11/28 MS: 1 InsertByte- 00:07:07.234 [2024-07-15 00:15:06.039380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0072 cdw11:290025ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.234 [2024-07-15 00:15:06.039415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.234 [2024-07-15 00:15:06.039470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:256600b2 cdw11:72002e80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.234 [2024-07-15 00:15:06.039486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.234 #48 NEW cov: 11747 ft: 14715 corp: 24/327b lim: 35 exec/s: 48 rss: 70Mb L: 17/28 MS: 1 PersAutoDict- DE: "\377)\233\262%f.\200"- 00:07:07.234 [2024-07-15 00:15:06.089458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:320a0027 cdw11:9b00ff29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.089489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.235 #49 NEW cov: 11747 ft: 14761 corp: 25/338b lim: 35 exec/s: 49 rss: 70Mb L: 11/28 MS: 1 ChangeBit- 00:07:07.235 [2024-07-15 00:15:06.149672] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:07.235 [2024-07-15 00:15:06.149887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2fff0032 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.149912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.235 [2024-07-15 00:15:06.149944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff001cff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.149961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.235 [2024-07-15 00:15:06.149990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.150005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.235 [2024-07-15 00:15:06.150033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.150049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.235 #50 NEW cov: 11747 ft: 14771 corp: 26/366b lim: 35 exec/s: 50 rss: 70Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:07.235 [2024-07-15 00:15:06.199800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 
nsid:0 cdw10:0aff0072 cdw11:4d002962 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.199830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.235 #51 NEW cov: 11747 ft: 14793 corp: 27/377b lim: 35 exec/s: 51 rss: 70Mb L: 11/28 MS: 1 ChangeBinInt- 00:07:07.235 [2024-07-15 00:15:06.250099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a250072 cdw11:ff00fe7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.250129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.235 [2024-07-15 00:15:06.250177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.250193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.235 [2024-07-15 00:15:06.250223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.250242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.235 [2024-07-15 00:15:06.250271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.235 [2024-07-15 00:15:06.250287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.494 #52 NEW cov: 11747 ft: 14849 corp: 28/411b lim: 35 exec/s: 52 rss: 70Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:07.494 [2024-07-15 00:15:06.320122] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:07.494 [2024-07-15 00:15:06.320243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0032 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.320266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.320297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:2e800066 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.320312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.320340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.320356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.494 #53 NEW cov: 11747 ft: 14867 corp: 29/432b lim: 35 exec/s: 53 rss: 70Mb L: 21/34 MS: 1 ShuffleBytes- 00:07:07.494 [2024-07-15 00:15:06.380387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2fff0032 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.380416] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.380470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:69690069 cdw11:ff006969 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.380486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.380515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.380530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.380559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.380574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.494 #54 NEW cov: 11747 ft: 14869 corp: 30/465b lim: 35 exec/s: 54 rss: 70Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:07:07.494 [2024-07-15 00:15:06.430491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff290032 cdw11:25009bb2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.430520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.430567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:800a002e cdw11:9b00ff29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.430583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.430611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:fe7a0025 cdw11:9b00ff25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.430630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.494 [2024-07-15 00:15:06.430658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:66ff0025 cdw11:b200299b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.494 [2024-07-15 00:15:06.430673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.494 #55 NEW cov: 11747 ft: 14880 corp: 31/499b lim: 35 exec/s: 27 rss: 70Mb L: 34/34 MS: 1 CrossOver- 00:07:07.494 #55 DONE cov: 11747 ft: 14880 corp: 31/499b lim: 35 exec/s: 27 rss: 70Mb 00:07:07.494 ###### Recommended dictionary. ###### 00:07:07.494 "\377)\233\262%f.\200" # Uses: 3 00:07:07.494 ###### End of recommended dictionary. 
###### 00:07:07.494 Done 55 runs in 2 second(s) 00:07:07.754 00:15:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:07.754 00:15:06 -- ../common.sh@72 -- # (( i++ )) 00:07:07.754 00:15:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:07.754 00:15:06 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:07.754 00:15:06 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:07.754 00:15:06 -- nvmf/run.sh@24 -- # local timen=1 00:07:07.754 00:15:06 -- nvmf/run.sh@25 -- # local core=0x1 00:07:07.754 00:15:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:07.754 00:15:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:07.754 00:15:06 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:07.754 00:15:06 -- nvmf/run.sh@29 -- # port=4403 00:07:07.754 00:15:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:07.754 00:15:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:07.754 00:15:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:07.754 00:15:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:07.754 [2024-07-15 00:15:06.647396] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:07.754 [2024-07-15 00:15:06.647499] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid327544 ] 00:07:07.754 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.012 [2024-07-15 00:15:06.825320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.013 [2024-07-15 00:15:06.886533] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:08.013 [2024-07-15 00:15:06.886678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.013 [2024-07-15 00:15:06.944555] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:08.013 [2024-07-15 00:15:06.960847] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:08.013 INFO: Running with entropic power schedule (0xFF, 100). 00:07:08.013 INFO: Seed: 209369260 00:07:08.013 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:08.013 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:08.013 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:08.013 INFO: A corpus is not provided, starting from an empty corpus 00:07:08.013 #2 INITED exec/s: 0 rss: 60Mb 00:07:08.013 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:08.013 This may also happen if the target rejected all inputs we tried so far 00:07:08.272 NEW_FUNC[1/659]: 0x485d00 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:08.272 NEW_FUNC[2/659]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:08.272 #4 NEW cov: 11423 ft: 11424 corp: 2/12b lim: 20 exec/s: 0 rss: 67Mb L: 11/11 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:08.272 [2024-07-15 00:15:07.317422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:08.272 [2024-07-15 00:15:07.317466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.532 NEW_FUNC[1/20]: 0x113e4b0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:08.532 NEW_FUNC[2/20]: 0x113f030 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:08.532 #11 NEW cov: 11875 ft: 12733 corp: 3/31b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:08.532 #12 NEW cov: 11881 ft: 12957 corp: 4/42b lim: 20 exec/s: 0 rss: 67Mb L: 11/19 MS: 1 ShuffleBytes- 00:07:08.532 #13 NEW cov: 11966 ft: 13266 corp: 5/53b lim: 20 exec/s: 0 rss: 67Mb L: 11/19 MS: 1 ChangeBinInt- 00:07:08.532 #14 NEW cov: 11970 ft: 13448 corp: 6/68b lim: 20 exec/s: 0 rss: 67Mb L: 15/19 MS: 1 InsertRepeatedBytes- 00:07:08.532 #15 NEW cov: 11970 ft: 13591 corp: 7/79b lim: 20 exec/s: 0 rss: 67Mb L: 11/19 MS: 1 ChangeByte- 00:07:08.532 #16 NEW cov: 11970 ft: 13688 corp: 8/91b lim: 20 exec/s: 0 rss: 67Mb L: 12/19 MS: 1 InsertByte- 00:07:08.532 #17 NEW cov: 11970 ft: 13702 corp: 9/103b lim: 20 exec/s: 0 rss: 67Mb L: 12/19 MS: 1 InsertByte- 00:07:08.792 #18 NEW cov: 11970 ft: 13739 corp: 10/120b lim: 20 exec/s: 0 rss: 67Mb L: 17/19 MS: 1 CrossOver- 00:07:08.792 #20 NEW cov: 11970 ft: 13787 corp: 11/136b lim: 20 exec/s: 0 rss: 67Mb L: 16/19 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:08.792 [2024-07-15 00:15:07.678342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:08.792 [2024-07-15 00:15:07.678371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.792 #21 NEW cov: 11970 ft: 13831 corp: 12/155b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ChangeByte- 00:07:08.792 #22 NEW cov: 11970 ft: 13858 corp: 13/175b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CrossOver- 00:07:08.792 #23 NEW cov: 11970 ft: 13866 corp: 14/195b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CMP- DE: "\005\000"- 00:07:08.792 #24 NEW cov: 11970 ft: 13879 corp: 15/207b lim: 20 exec/s: 0 rss: 68Mb L: 12/20 MS: 1 CrossOver- 00:07:09.051 #25 NEW cov: 11970 ft: 13900 corp: 16/224b lim: 20 exec/s: 0 rss: 68Mb L: 17/20 MS: 1 EraseBytes- 00:07:09.051 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:09.052 #26 NEW cov: 11993 ft: 14022 corp: 17/239b lim: 20 exec/s: 0 rss: 68Mb L: 15/20 MS: 1 CrossOver- 00:07:09.052 #27 NEW cov: 11993 ft: 14035 corp: 18/251b lim: 20 exec/s: 0 rss: 68Mb L: 12/20 MS: 1 InsertByte- 00:07:09.052 #28 NEW cov: 11993 ft: 14055 corp: 19/264b lim: 20 exec/s: 28 rss: 68Mb L: 13/20 MS: 1 
InsertByte- 00:07:09.052 #29 NEW cov: 11993 ft: 14061 corp: 20/275b lim: 20 exec/s: 29 rss: 68Mb L: 11/20 MS: 1 ChangeByte- 00:07:09.052 #30 NEW cov: 11993 ft: 14114 corp: 21/292b lim: 20 exec/s: 30 rss: 68Mb L: 17/20 MS: 1 ChangeByte- 00:07:09.311 #31 NEW cov: 11993 ft: 14117 corp: 22/307b lim: 20 exec/s: 31 rss: 68Mb L: 15/20 MS: 1 ChangeByte- 00:07:09.311 #32 NEW cov: 11993 ft: 14151 corp: 23/320b lim: 20 exec/s: 32 rss: 68Mb L: 13/20 MS: 1 InsertByte- 00:07:09.311 #33 NEW cov: 11993 ft: 14158 corp: 24/338b lim: 20 exec/s: 33 rss: 69Mb L: 18/20 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:09.311 #34 NEW cov: 11993 ft: 14171 corp: 25/355b lim: 20 exec/s: 34 rss: 69Mb L: 17/20 MS: 1 ChangeByte- 00:07:09.311 #35 NEW cov: 11993 ft: 14212 corp: 26/363b lim: 20 exec/s: 35 rss: 69Mb L: 8/20 MS: 1 EraseBytes- 00:07:09.311 #36 NEW cov: 11993 ft: 14216 corp: 27/374b lim: 20 exec/s: 36 rss: 69Mb L: 11/20 MS: 1 ShuffleBytes- 00:07:09.311 #41 NEW cov: 11993 ft: 14233 corp: 28/383b lim: 20 exec/s: 41 rss: 69Mb L: 9/20 MS: 5 CrossOver-CopyPart-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:09.570 #42 NEW cov: 11993 ft: 14275 corp: 29/395b lim: 20 exec/s: 42 rss: 69Mb L: 12/20 MS: 1 CopyPart- 00:07:09.570 #43 NEW cov: 11993 ft: 14284 corp: 30/414b lim: 20 exec/s: 43 rss: 69Mb L: 19/20 MS: 1 CrossOver- 00:07:09.570 #44 NEW cov: 11993 ft: 14288 corp: 31/434b lim: 20 exec/s: 44 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:09.570 #45 NEW cov: 11993 ft: 14533 corp: 32/440b lim: 20 exec/s: 45 rss: 69Mb L: 6/20 MS: 1 EraseBytes- 00:07:09.570 #46 NEW cov: 11993 ft: 14608 corp: 33/449b lim: 20 exec/s: 46 rss: 69Mb L: 9/20 MS: 1 ChangeByte- 00:07:09.570 #47 NEW cov: 11993 ft: 14634 corp: 34/469b lim: 20 exec/s: 47 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:09.570 #48 NEW cov: 11993 ft: 14647 corp: 35/486b lim: 20 exec/s: 48 rss: 69Mb L: 17/20 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:09.829 #49 NEW cov: 11993 ft: 14678 corp: 36/502b lim: 20 exec/s: 49 rss: 69Mb L: 16/20 MS: 1 InsertByte- 00:07:09.829 #50 NEW cov: 11993 ft: 14687 corp: 37/515b lim: 20 exec/s: 50 rss: 70Mb L: 13/20 MS: 1 ChangeBit- 00:07:09.829 #51 NEW cov: 11993 ft: 14719 corp: 38/521b lim: 20 exec/s: 51 rss: 70Mb L: 6/20 MS: 1 ChangeByte- 00:07:09.829 #52 NEW cov: 11993 ft: 14727 corp: 39/538b lim: 20 exec/s: 52 rss: 70Mb L: 17/20 MS: 1 ChangeBit- 00:07:09.829 #53 NEW cov: 11993 ft: 14751 corp: 40/546b lim: 20 exec/s: 53 rss: 70Mb L: 8/20 MS: 1 EraseBytes- 00:07:09.829 [2024-07-15 00:15:08.831617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:09.829 [2024-07-15 00:15:08.831646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.829 #54 NEW cov: 11993 ft: 14920 corp: 41/565b lim: 20 exec/s: 54 rss: 70Mb L: 19/20 MS: 1 ChangeByte- 00:07:10.088 #55 NEW cov: 11993 ft: 14953 corp: 42/582b lim: 20 exec/s: 55 rss: 70Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:10.088 #56 NEW cov: 11993 ft: 14960 corp: 43/595b lim: 20 exec/s: 56 rss: 70Mb L: 13/20 MS: 1 ChangeBinInt- 00:07:10.088 #57 NEW cov: 11993 ft: 14971 corp: 44/612b lim: 20 exec/s: 57 rss: 70Mb L: 17/20 MS: 1 ChangeBit- 00:07:10.088 #58 NEW cov: 11993 ft: 15025 corp: 45/630b lim: 20 exec/s: 29 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:07:10.088 #58 DONE cov: 11993 ft: 15025 corp: 45/630b lim: 20 exec/s: 29 rss: 70Mb 00:07:10.088 ###### Recommended dictionary. 
###### 00:07:10.088 "\005\000" # Uses: 2 00:07:10.088 ###### End of recommended dictionary. ###### 00:07:10.088 Done 58 runs in 2 second(s) 00:07:10.088 00:15:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:10.088 00:15:09 -- ../common.sh@72 -- # (( i++ )) 00:07:10.088 00:15:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:10.088 00:15:09 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:10.088 00:15:09 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:10.088 00:15:09 -- nvmf/run.sh@24 -- # local timen=1 00:07:10.088 00:15:09 -- nvmf/run.sh@25 -- # local core=0x1 00:07:10.088 00:15:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:10.088 00:15:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:10.088 00:15:09 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:10.088 00:15:09 -- nvmf/run.sh@29 -- # port=4404 00:07:10.088 00:15:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:10.088 00:15:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:10.088 00:15:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:10.347 00:15:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:10.347 [2024-07-15 00:15:09.175820] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:10.347 [2024-07-15 00:15:09.175887] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid327856 ] 00:07:10.347 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.347 [2024-07-15 00:15:09.359095] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.606 [2024-07-15 00:15:09.421761] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:10.606 [2024-07-15 00:15:09.421908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.606 [2024-07-15 00:15:09.479869] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:10.606 [2024-07-15 00:15:09.496168] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:10.606 INFO: Running with entropic power schedule (0xFF, 100). 00:07:10.606 INFO: Seed: 2744366170 00:07:10.606 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:10.606 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:10.606 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:10.606 INFO: A corpus is not provided, starting from an empty corpus 00:07:10.606 #2 INITED exec/s: 0 rss: 60Mb 00:07:10.606 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:10.606 This may also happen if the target rejected all inputs we tried so far 00:07:10.606 [2024-07-15 00:15:09.540799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.606 [2024-07-15 00:15:09.540835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.865 NEW_FUNC[1/671]: 0x486df0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:10.865 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:10.865 #3 NEW cov: 11528 ft: 11529 corp: 2/10b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CMP- DE: "n\000\000\000\000\000\000\000"- 00:07:10.865 [2024-07-15 00:15:09.871582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00004a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.865 [2024-07-15 00:15:09.871620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.865 #5 NEW cov: 11645 ft: 12066 corp: 3/19b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 2 ChangeBit-PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:10.865 [2024-07-15 00:15:09.921635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00004a72 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.865 [2024-07-15 00:15:09.921666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.124 #6 NEW cov: 11651 ft: 12321 corp: 4/28b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeByte- 00:07:11.124 [2024-07-15 00:15:09.981808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000096e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.124 [2024-07-15 00:15:09.981839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.124 #7 NEW cov: 11736 ft: 12548 corp: 5/37b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:11.124 [2024-07-15 00:15:10.052015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000096e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.125 [2024-07-15 00:15:10.052050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.125 #8 NEW cov: 11736 ft: 12633 corp: 6/45b lim: 35 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 EraseBytes- 00:07:11.125 [2024-07-15 00:15:10.122407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.125 [2024-07-15 00:15:10.122450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.125 #9 NEW cov: 11736 ft: 12682 corp: 7/54b lim: 35 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBit- 00:07:11.125 [2024-07-15 00:15:10.172529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:11.125 [2024-07-15 00:15:10.172559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.125 [2024-07-15 00:15:10.172607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00006e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.125 [2024-07-15 00:15:10.172624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.383 #10 NEW cov: 11736 ft: 13464 corp: 8/71b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:11.383 [2024-07-15 00:15:10.232664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:04000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.383 [2024-07-15 00:15:10.232694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.383 #11 NEW cov: 11736 ft: 13522 corp: 9/80b lim: 35 exec/s: 0 rss: 68Mb L: 9/17 MS: 1 ChangeBinInt- 00:07:11.383 [2024-07-15 00:15:10.282756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.383 [2024-07-15 00:15:10.282785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.383 [2024-07-15 00:15:10.282833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00006e08 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.383 [2024-07-15 00:15:10.282848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.384 #12 NEW cov: 11736 ft: 13667 corp: 10/97b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 ChangeBit- 00:07:11.384 [2024-07-15 00:15:10.352962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:006e0a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.384 [2024-07-15 00:15:10.352992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.384 [2024-07-15 00:15:10.353040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.384 [2024-07-15 00:15:10.353057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.384 #13 NEW cov: 11736 ft: 13743 corp: 11/114b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:11.384 [2024-07-15 00:15:10.403082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:6e000a0a cdw11:6e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.384 [2024-07-15 00:15:10.403111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.384 [2024-07-15 00:15:10.403160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.384 [2024-07-15 00:15:10.403176] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.642 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:11.642 #15 NEW cov: 11753 ft: 13793 corp: 12/132b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 2 ShuffleBytes-CrossOver- 00:07:11.642 [2024-07-15 00:15:10.463232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:006e0a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.463264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.642 [2024-07-15 00:15:10.463312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.463328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.642 #16 NEW cov: 11753 ft: 13834 corp: 13/149b lim: 35 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 ShuffleBytes- 00:07:11.642 [2024-07-15 00:15:10.533432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.533465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.642 [2024-07-15 00:15:10.533513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00006e08 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.533529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.642 #17 NEW cov: 11753 ft: 13894 corp: 14/166b lim: 35 exec/s: 17 rss: 68Mb L: 17/18 MS: 1 CrossOver- 00:07:11.642 [2024-07-15 00:15:10.603680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.603709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.642 [2024-07-15 00:15:10.603757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00006e08 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.603773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.642 [2024-07-15 00:15:10.603801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000006e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.603816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.642 #18 NEW cov: 11753 ft: 14148 corp: 15/191b lim: 35 exec/s: 18 rss: 68Mb L: 25/25 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:11.642 [2024-07-15 00:15:10.673809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 
00:15:10.673841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.642 [2024-07-15 00:15:10.673873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.642 [2024-07-15 00:15:10.673889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.917 #19 NEW cov: 11753 ft: 14187 corp: 16/207b lim: 35 exec/s: 19 rss: 68Mb L: 16/25 MS: 1 InsertRepeatedBytes- 00:07:11.917 [2024-07-15 00:15:10.743919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00080a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.917 [2024-07-15 00:15:10.743950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.917 #20 NEW cov: 11753 ft: 14261 corp: 17/216b lim: 35 exec/s: 20 rss: 68Mb L: 9/25 MS: 1 ShuffleBytes- 00:07:11.917 [2024-07-15 00:15:10.804063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000096e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.917 [2024-07-15 00:15:10.804093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.917 #21 NEW cov: 11753 ft: 14311 corp: 18/224b lim: 35 exec/s: 21 rss: 68Mb L: 8/25 MS: 1 CrossOver- 00:07:11.917 [2024-07-15 00:15:10.864359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.917 [2024-07-15 00:15:10.864389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.917 [2024-07-15 00:15:10.864422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a6e0000 cdw11:006e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.917 [2024-07-15 00:15:10.864437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.917 [2024-07-15 00:15:10.864474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.917 [2024-07-15 00:15:10.864490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.917 #22 NEW cov: 11753 ft: 14327 corp: 19/250b lim: 35 exec/s: 22 rss: 69Mb L: 26/26 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:11.917 [2024-07-15 00:15:10.934539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.918 [2024-07-15 00:15:10.934575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.918 [2024-07-15 00:15:10.934638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00006e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.918 [2024-07-15 00:15:10.934656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.918 #23 NEW cov: 11753 ft: 14402 corp: 20/267b lim: 35 exec/s: 23 rss: 69Mb L: 17/26 MS: 1 ChangeBit- 00:07:12.176 [2024-07-15 00:15:10.975329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.176 [2024-07-15 00:15:10.975356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.176 [2024-07-15 00:15:10.975412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00006e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.176 [2024-07-15 00:15:10.975426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.176 #24 NEW cov: 11753 ft: 14486 corp: 21/284b lim: 35 exec/s: 24 rss: 69Mb L: 17/26 MS: 1 ChangeBit- 00:07:12.176 [2024-07-15 00:15:11.015253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00080a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.176 [2024-07-15 00:15:11.015279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.176 #25 NEW cov: 11753 ft: 14571 corp: 22/293b lim: 35 exec/s: 25 rss: 69Mb L: 9/26 MS: 1 ChangeBit- 00:07:12.176 [2024-07-15 00:15:11.055345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00006e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.176 [2024-07-15 00:15:11.055369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.176 #26 NEW cov: 11753 ft: 14601 corp: 23/302b lim: 35 exec/s: 26 rss: 69Mb L: 9/26 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:12.177 [2024-07-15 00:15:11.095777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:6e000a0a cdw11:6e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.095802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 [2024-07-15 00:15:11.095862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.095875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.177 [2024-07-15 00:15:11.095930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:82820001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.095943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.177 #27 NEW cov: 11753 ft: 14656 corp: 24/328b lim: 35 exec/s: 27 rss: 69Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:12.177 [2024-07-15 00:15:11.135628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a82 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.135654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 #30 NEW cov: 11753 ft: 14668 corp: 25/336b lim: 35 exec/s: 30 rss: 69Mb L: 8/26 MS: 3 CopyPart-ChangeByte-CrossOver- 00:07:12.177 [2024-07-15 00:15:11.175875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.175900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 [2024-07-15 00:15:11.175968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a6e0000 cdw11:006e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.175982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.177 #31 NEW cov: 11753 ft: 14689 corp: 26/354b lim: 35 exec/s: 31 rss: 69Mb L: 18/26 MS: 1 EraseBytes- 00:07:12.177 [2024-07-15 00:15:11.215834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ca6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.177 [2024-07-15 00:15:11.215858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 #32 NEW cov: 11753 ft: 14719 corp: 27/363b lim: 35 exec/s: 32 rss: 69Mb L: 9/26 MS: 1 ChangeBit- 00:07:12.436 [2024-07-15 00:15:11.255951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00044a72 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.255975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 #33 NEW cov: 11753 ft: 14740 corp: 28/372b lim: 35 exec/s: 33 rss: 69Mb L: 9/26 MS: 1 ChangeBit- 00:07:12.436 [2024-07-15 00:15:11.296357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.296381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 [2024-07-15 00:15:11.296439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a6e0000 cdw11:806e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.296456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.436 [2024-07-15 00:15:11.296508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.296522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.436 #34 NEW cov: 11753 ft: 14752 corp: 29/398b lim: 35 exec/s: 34 rss: 69Mb L: 26/26 MS: 1 ChangeBit- 00:07:12.436 [2024-07-15 00:15:11.336157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00004a72 cdw11:09000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.336181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 #35 NEW cov: 11753 ft: 14772 corp: 30/407b lim: 35 exec/s: 35 rss: 69Mb L: 9/26 MS: 1 ChangeBinInt- 00:07:12.436 [2024-07-15 00:15:11.366463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:003f0a6e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.366488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 [2024-07-15 00:15:11.366557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000206e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.366570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.436 #36 NEW cov: 11753 ft: 14786 corp: 31/425b lim: 35 exec/s: 36 rss: 69Mb L: 18/26 MS: 1 InsertByte- 00:07:12.436 [2024-07-15 00:15:11.406534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00004a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.406558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 [2024-07-15 00:15:11.406611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6e006e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.406625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.436 #38 NEW cov: 11753 ft: 14800 corp: 32/441b lim: 35 exec/s: 38 rss: 69Mb L: 16/26 MS: 2 ChangeBit-CrossOver- 00:07:12.436 [2024-07-15 00:15:11.446476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00004a72 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.446500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 #39 NEW cov: 11760 ft: 14890 corp: 33/451b lim: 35 exec/s: 39 rss: 69Mb L: 10/26 MS: 1 InsertByte- 00:07:12.436 [2024-07-15 00:15:11.476735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:006e0a6e cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.476759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.436 [2024-07-15 00:15:11.476815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.436 [2024-07-15 00:15:11.476828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.694 #40 NEW cov: 11760 ft: 14921 corp: 34/468b lim: 35 exec/s: 40 rss: 69Mb L: 17/26 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:07:12.694 [2024-07-15 00:15:11.516849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00044a72 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.694 [2024-07-15 00:15:11.516873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.694 [2024-07-15 00:15:11.516928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00040072 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.694 [2024-07-15 00:15:11.516941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.694 #41 NEW cov: 11760 ft: 14934 corp: 35/484b lim: 35 exec/s: 20 rss: 69Mb L: 16/26 MS: 1 CopyPart- 00:07:12.694 #41 DONE cov: 11760 ft: 14934 corp: 35/484b lim: 35 exec/s: 20 rss: 69Mb 00:07:12.694 ###### Recommended dictionary. ###### 00:07:12.694 "n\000\000\000\000\000\000\000" # Uses: 7 00:07:12.694 ###### End of recommended dictionary. ###### 00:07:12.694 Done 41 runs in 2 second(s) 00:07:12.694 00:15:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:12.694 00:15:11 -- ../common.sh@72 -- # (( i++ )) 00:07:12.694 00:15:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:12.694 00:15:11 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:12.694 00:15:11 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:12.694 00:15:11 -- nvmf/run.sh@24 -- # local timen=1 00:07:12.694 00:15:11 -- nvmf/run.sh@25 -- # local core=0x1 00:07:12.694 00:15:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:12.694 00:15:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:12.694 00:15:11 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:12.694 00:15:11 -- nvmf/run.sh@29 -- # port=4405 00:07:12.694 00:15:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:12.694 00:15:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:12.694 00:15:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:12.694 00:15:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:12.694 [2024-07-15 00:15:11.705938] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
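The traced nvmf/run.sh lines above repeat the same per-fuzzer setup for each fuzzer type in this section of the log (3, 4, 5): derive a port, build a TRID, rewrite the JSON config's listener port, and launch llvm_nvme_fuzz against a per-fuzzer corpus directory and RPC socket. Below is a minimal sketch of that sequence, reconstructed only from the commands printed in the trace; $SPDK_ROOT stands in for the logged workspace path /var/jenkins/workspace/short-fuzz-phy-autotest/spdk, the "44" prefix in the port derivation is inferred from the printf %02d / port pairs (printf %02d 5 followed by port=4405), and the redirection of sed's output into the temp config is implied by the later -c /tmp/fuzz_json_N.conf arguments rather than shown verbatim.

    fuzzer_type=5                                # trace: start_llvm_fuzz 5 1 0x1
    timen=1                                      # becomes -t 1 (run time in seconds)
    core=0x1                                     # becomes -m 0x1 (reactor core mask)
    port="44$(printf '%02d' "$fuzzer_type")"     # trace: printf %02d 5 -> port=4405 (prefix inferred)
    nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
    corpus_dir="$SPDK_ROOT/../corpus/llvm_nvmf_${fuzzer_type}"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

    mkdir -p "$corpus_dir"
    # Point the default 4420 listener at the per-fuzzer port
    # (output redirection into $nvmf_cfg inferred, as noted above).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
        "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 -P "$SPDK_ROOT/../output/llvm/" \
        -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"

Per the common.sh trace that follows each run, the temp config is then removed (rm -rf /tmp/fuzz_json_N.conf) and the loop counter i advances to the next fuzzer type.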
00:07:12.694 [2024-07-15 00:15:11.706007] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid328378 ] 00:07:12.694 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.953 [2024-07-15 00:15:11.883016] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.953 [2024-07-15 00:15:11.944496] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:12.953 [2024-07-15 00:15:11.944642] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.953 [2024-07-15 00:15:12.002411] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:13.210 [2024-07-15 00:15:12.018712] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:13.210 INFO: Running with entropic power schedule (0xFF, 100). 00:07:13.211 INFO: Seed: 971402165 00:07:13.211 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:13.211 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:13.211 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:13.211 INFO: A corpus is not provided, starting from an empty corpus 00:07:13.211 #2 INITED exec/s: 0 rss: 60Mb 00:07:13.211 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:13.211 This may also happen if the target rejected all inputs we tried so far 00:07:13.211 [2024-07-15 00:15:12.064007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.211 [2024-07-15 00:15:12.064036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.211 [2024-07-15 00:15:12.064090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.211 [2024-07-15 00:15:12.064104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.470 NEW_FUNC[1/671]: 0x488f80 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:13.470 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:13.470 #9 NEW cov: 11543 ft: 11544 corp: 2/27b lim: 45 exec/s: 0 rss: 67Mb L: 26/26 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:13.470 [2024-07-15 00:15:12.384977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.385009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.470 [2024-07-15 00:15:12.385065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.385078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.470 [2024-07-15 00:15:12.385131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.385144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.470 #10 NEW cov: 11656 ft: 12290 corp: 3/55b lim: 45 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CopyPart- 00:07:13.470 [2024-07-15 00:15:12.434725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.434753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.470 #11 NEW cov: 11662 ft: 13313 corp: 4/71b lim: 45 exec/s: 0 rss: 67Mb L: 16/28 MS: 1 EraseBytes- 00:07:13.470 [2024-07-15 00:15:12.474970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.474995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.470 [2024-07-15 00:15:12.475050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.475064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.470 #12 NEW cov: 11747 ft: 13508 corp: 5/96b lim: 45 exec/s: 0 rss: 67Mb L: 25/28 MS: 1 CopyPart- 00:07:13.470 [2024-07-15 00:15:12.515449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.515475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.470 [2024-07-15 00:15:12.515530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.515543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.470 [2024-07-15 00:15:12.515597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.515611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.470 [2024-07-15 00:15:12.515661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.470 [2024-07-15 00:15:12.515674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.730 #13 NEW cov: 11747 ft: 13862 corp: 6/135b lim: 45 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:13.730 [2024-07-15 00:15:12.555242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.555267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.730 [2024-07-15 00:15:12.555322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.555336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.730 #14 NEW cov: 11747 ft: 13927 corp: 7/160b lim: 45 exec/s: 0 rss: 68Mb L: 25/39 MS: 1 ChangeASCIIInt- 00:07:13.730 [2024-07-15 00:15:12.595683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.595709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.730 [2024-07-15 00:15:12.595746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.595759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.730 [2024-07-15 00:15:12.595814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.595828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.730 [2024-07-15 00:15:12.595879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.595892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.730 #15 NEW cov: 11747 ft: 14056 corp: 8/196b lim: 45 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 CopyPart- 00:07:13.730 [2024-07-15 00:15:12.635297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.635322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.730 #16 NEW cov: 11747 ft: 14101 corp: 9/212b lim: 45 exec/s: 0 rss: 68Mb L: 16/39 MS: 1 InsertRepeatedBytes- 00:07:13.730 [2024-07-15 00:15:12.675846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.675871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.730 [2024-07-15 00:15:12.675923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.730 [2024-07-15 00:15:12.675937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.675990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.676003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.676042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.676058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.731 #17 NEW cov: 11747 ft: 14136 corp: 10/248b lim: 45 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 CopyPart- 00:07:13.731 [2024-07-15 00:15:12.716023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.716048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.716103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.716117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.716152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.716181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.716232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.716245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.731 #18 NEW cov: 11747 ft: 14157 corp: 11/284b lim: 45 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 CrossOver- 00:07:13.731 [2024-07-15 00:15:12.756086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.756111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.756181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.756195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.731 [2024-07-15 00:15:12.756247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.756260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:13.731 [2024-07-15 00:15:12.756291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff37ffff cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.731 [2024-07-15 00:15:12.756304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.731 #19 NEW cov: 11747 ft: 14178 corp: 12/323b lim: 45 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:13.990 [2024-07-15 00:15:12.796246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.796272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.796325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.796339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.796391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.796408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.796459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37ba3737 cdw11:ba370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.796472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.990 #20 NEW cov: 11747 ft: 14197 corp: 13/362b lim: 45 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:13.990 [2024-07-15 00:15:12.836033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.836057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.836109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.836122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.990 #21 NEW cov: 11747 ft: 14216 corp: 14/388b lim: 45 exec/s: 0 rss: 68Mb L: 26/39 MS: 1 ChangeASCIIInt- 00:07:13.990 [2024-07-15 00:15:12.876473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.876497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.876551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 
00:15:12.876565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.876634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:36370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.876647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.990 [2024-07-15 00:15:12.876700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.990 [2024-07-15 00:15:12.876714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.990 #22 NEW cov: 11747 ft: 14262 corp: 15/424b lim: 45 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 ChangeBit- 00:07:13.991 [2024-07-15 00:15:12.916399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.916423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.991 [2024-07-15 00:15:12.916483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.916498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.991 [2024-07-15 00:15:12.916550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373637 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.916563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.991 #23 NEW cov: 11747 ft: 14281 corp: 16/456b lim: 45 exec/s: 0 rss: 68Mb L: 32/39 MS: 1 EraseBytes- 00:07:13.991 [2024-07-15 00:15:12.956418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.956449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.991 [2024-07-15 00:15:12.956520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.956534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.991 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:13.991 #24 NEW cov: 11770 ft: 14415 corp: 17/481b lim: 45 exec/s: 0 rss: 68Mb L: 25/39 MS: 1 ShuffleBytes- 00:07:13.991 [2024-07-15 00:15:12.996498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.996523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:13.991 [2024-07-15 00:15:12.996576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:12.996589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.991 #25 NEW cov: 11770 ft: 14434 corp: 18/505b lim: 45 exec/s: 0 rss: 68Mb L: 24/39 MS: 1 EraseBytes- 00:07:13.991 [2024-07-15 00:15:13.036597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:13.036621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.991 [2024-07-15 00:15:13.036674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.991 [2024-07-15 00:15:13.036687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 #26 NEW cov: 11770 ft: 14440 corp: 19/531b lim: 45 exec/s: 26 rss: 68Mb L: 26/39 MS: 1 ChangeASCIIInt- 00:07:14.293 [2024-07-15 00:15:13.077015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.077040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.077111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37170001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.077125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.077177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:36370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.077190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.077241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.077254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.293 #27 NEW cov: 11770 ft: 14462 corp: 20/567b lim: 45 exec/s: 27 rss: 68Mb L: 36/39 MS: 1 ChangeBit- 00:07:14.293 [2024-07-15 00:15:13.117126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.117151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.117207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.117220] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.117273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3737ba37 cdw11:37370005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.117286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.117337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.117350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.293 #28 NEW cov: 11770 ft: 14497 corp: 21/606b lim: 45 exec/s: 28 rss: 68Mb L: 39/39 MS: 1 CopyPart- 00:07:14.293 [2024-07-15 00:15:13.157093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.157117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.157188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3737373f cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.157202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.157256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373637 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.157270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.293 #29 NEW cov: 11770 ft: 14502 corp: 22/638b lim: 45 exec/s: 29 rss: 68Mb L: 32/39 MS: 1 ChangeBit- 00:07:14.293 [2024-07-15 00:15:13.197357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.197381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.197434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.197455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.197509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.197523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.197576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.197588] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.293 #30 NEW cov: 11770 ft: 14511 corp: 23/678b lim: 45 exec/s: 30 rss: 68Mb L: 40/40 MS: 1 CMP- DE: "\033\000\000\000"- 00:07:14.293 [2024-07-15 00:15:13.237491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.237515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.237571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.237585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.237639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.237653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.237722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.237736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.293 #31 NEW cov: 11770 ft: 14545 corp: 24/714b lim: 45 exec/s: 31 rss: 68Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:07:14.293 [2024-07-15 00:15:13.277601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.277625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.277695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.277708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.277760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.277774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.293 [2024-07-15 00:15:13.277825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.293 [2024-07-15 00:15:13.277838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.293 #32 NEW cov: 11770 ft: 14620 corp: 25/754b lim: 45 exec/s: 32 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:07:14.614 [2024-07-15 00:15:13.317778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO 
SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.317804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-07-15 00:15:13.317857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.317871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 [2024-07-15 00:15:13.317924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.317937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 [2024-07-15 00:15:13.317989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff37ffff cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.318003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.614 #33 NEW cov: 11770 ft: 14678 corp: 26/793b lim: 45 exec/s: 33 rss: 68Mb L: 39/40 MS: 1 ChangeBit- 00:07:14.614 [2024-07-15 00:15:13.357602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.357627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-07-15 00:15:13.357680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.357693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 #34 NEW cov: 11770 ft: 14688 corp: 27/817b lim: 45 exec/s: 34 rss: 69Mb L: 24/40 MS: 1 ChangeBit- 00:07:14.614 [2024-07-15 00:15:13.397832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.614 [2024-07-15 00:15:13.397856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-07-15 00:15:13.397912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3737373f cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.397926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.397979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373637 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.397992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 #35 NEW cov: 11770 ft: 14696 corp: 28/849b lim: 45 exec/s: 35 rss: 69Mb L: 32/40 MS: 1 
ChangeASCIIInt- 00:07:14.615 [2024-07-15 00:15:13.438064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.438089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.438144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.438158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.438210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:89378989 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.438224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.438277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373736 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.438291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.615 #36 NEW cov: 11770 ft: 14706 corp: 29/891b lim: 45 exec/s: 36 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:07:14.615 [2024-07-15 00:15:13.478164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:373737eb cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.478188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.478255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.478269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.478330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37360001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.478343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.478395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.478408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.615 #37 NEW cov: 11770 ft: 14720 corp: 30/928b lim: 45 exec/s: 37 rss: 69Mb L: 37/42 MS: 1 InsertByte- 00:07:14.615 [2024-07-15 00:15:13.518176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.518201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.518257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.518271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.518325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373637 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.518339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 #38 NEW cov: 11770 ft: 14736 corp: 31/960b lim: 45 exec/s: 38 rss: 69Mb L: 32/42 MS: 1 ChangeASCIIInt- 00:07:14.615 [2024-07-15 00:15:13.558402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.558427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.558486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.558500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.558565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.558579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.558630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.558643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.615 #39 NEW cov: 11770 ft: 14777 corp: 32/996b lim: 45 exec/s: 39 rss: 69Mb L: 36/42 MS: 1 CMP- DE: "\000\000\000\000\000\000\004\000"- 00:07:14.615 [2024-07-15 00:15:13.598536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.598560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.598613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.598629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.598682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.598695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.598746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.598760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.615 #40 NEW cov: 11770 ft: 14785 corp: 33/1035b lim: 45 exec/s: 40 rss: 69Mb L: 39/42 MS: 1 CopyPart- 00:07:14.615 [2024-07-15 00:15:13.638665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.638689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.638744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.638758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.638809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:42374242 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.638822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.615 [2024-07-15 00:15:13.638872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.615 [2024-07-15 00:15:13.638884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.615 #41 NEW cov: 11770 ft: 14830 corp: 34/1076b lim: 45 exec/s: 41 rss: 69Mb L: 41/42 MS: 1 InsertRepeatedBytes- 00:07:14.873 [2024-07-15 00:15:13.678499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.678525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.873 [2024-07-15 00:15:13.678596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.678610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.873 #42 NEW cov: 11770 ft: 14843 corp: 35/1102b lim: 45 exec/s: 42 rss: 69Mb L: 26/42 MS: 1 ChangeBit- 00:07:14.873 [2024-07-15 00:15:13.708908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.708932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.873 [2024-07-15 00:15:13.709002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:baba0005 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.709017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.873 [2024-07-15 00:15:13.709068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:04000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.709085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.873 [2024-07-15 00:15:13.709135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37ba3737 cdw11:ba370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.709148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.873 #43 NEW cov: 11770 ft: 14855 corp: 36/1141b lim: 45 exec/s: 43 rss: 69Mb L: 39/42 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:07:14.873 [2024-07-15 00:15:13.748676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.748701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.873 [2024-07-15 00:15:13.748754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.748768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.873 #44 NEW cov: 11770 ft: 14863 corp: 37/1167b lim: 45 exec/s: 44 rss: 69Mb L: 26/42 MS: 1 EraseBytes- 00:07:14.873 [2024-07-15 00:15:13.788624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.873 [2024-07-15 00:15:13.788649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.873 #45 NEW cov: 11770 ft: 14895 corp: 38/1183b lim: 45 exec/s: 45 rss: 69Mb L: 16/42 MS: 1 CopyPart- 00:07:14.873 [2024-07-15 00:15:13.829008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.829034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.874 [2024-07-15 00:15:13.829091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.829105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.874 [2024-07-15 00:15:13.829174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.829188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.874 #46 NEW cov: 
11770 ft: 14901 corp: 39/1215b lim: 45 exec/s: 46 rss: 69Mb L: 32/42 MS: 1 CrossOver- 00:07:14.874 [2024-07-15 00:15:13.868914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37353737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.868939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.874 #47 NEW cov: 11770 ft: 14951 corp: 40/1231b lim: 45 exec/s: 47 rss: 69Mb L: 16/42 MS: 1 ChangeBit- 00:07:14.874 [2024-07-15 00:15:13.909463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.909487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.874 [2024-07-15 00:15:13.909557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:baba0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.909572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.874 [2024-07-15 00:15:13.909629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.909653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.874 [2024-07-15 00:15:13.909707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.874 [2024-07-15 00:15:13.909720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.133 #48 NEW cov: 11770 ft: 14976 corp: 41/1273b lim: 45 exec/s: 48 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:07:15.133 [2024-07-15 00:15:13.949460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3f373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.133 [2024-07-15 00:15:13.949485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.133 [2024-07-15 00:15:13.949541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.133 [2024-07-15 00:15:13.949554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.133 #49 NEW cov: 11770 ft: 15020 corp: 42/1298b lim: 45 exec/s: 49 rss: 69Mb L: 25/42 MS: 1 ChangeBit- 00:07:15.133 [2024-07-15 00:15:13.989665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.133 [2024-07-15 00:15:13.989690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.133 [2024-07-15 00:15:13.989744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37373737 
cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:13.989757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:15.133 [2024-07-15 00:15:13.989809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:13.989822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:15.133 [2024-07-15 00:15:13.989875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff37ffff cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:13.989888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:15.133 #55 NEW cov: 11770 ft: 15026 corp: 43/1337b lim: 45 exec/s: 55 rss: 69Mb L: 39/42 MS: 1 ChangeByte-
00:07:15.133 [2024-07-15 00:15:14.029803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:14.029827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:15.133 [2024-07-15 00:15:14.029880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:37003737 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:14.029894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:15.133 [2024-07-15 00:15:14.029948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:14.029961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:15.133 [2024-07-15 00:15:14.030016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:37373737 cdw11:37370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.133 [2024-07-15 00:15:14.030029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:15.133 #56 NEW cov: 11770 ft: 15035 corp: 44/1376b lim: 45 exec/s: 28 rss: 69Mb L: 39/42 MS: 1 ChangeBinInt-
00:07:15.133 #56 DONE cov: 11770 ft: 15035 corp: 44/1376b lim: 45 exec/s: 28 rss: 69Mb
00:07:15.133 ###### Recommended dictionary. ######
00:07:15.133 "\033\000\000\000" # Uses: 0
00:07:15.133 "\000\000\000\000\000\000\004\000" # Uses: 1
00:07:15.133 ###### End of recommended dictionary. ######
00:07:15.133 Done 56 runs in 2 second(s)
00:07:15.133 00:15:14 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf
00:15:14 -- ../common.sh@72 -- # (( i++ ))
00:15:14 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:15:14 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:15:14 -- nvmf/run.sh@23 -- # local fuzzer_type=6
00:15:14 -- nvmf/run.sh@24 -- # local timen=1
00:15:14 -- nvmf/run.sh@25 -- # local core=0x1
00:15:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:15:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf
00:15:14 -- nvmf/run.sh@29 -- # printf %02d 6
00:15:14 -- nvmf/run.sh@29 -- # port=4406
00:15:14 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:15:14 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406'
00:15:14 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:15:14 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock
[2024-07-15 00:15:14.212880] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
[2024-07-15 00:15:14.212947] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid328870 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-07-15 00:15:14.394753] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-07-15 00:15:14.456933] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
[2024-07-15 00:15:14.457063] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-07-15 00:15:14.514999] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-07-15 00:15:14.531296] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 ***
INFO: Running with entropic power schedule (0xFF, 100).
INFO: Seed: 3484400966
INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
INFO: A corpus is not provided, starting from an empty corpus
#2 INITED exec/s: 0 rss: 60Mb
WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:15.651 This may also happen if the target rejected all inputs we tried so far 00:07:15.651 [2024-07-15 00:15:14.576405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:15.651 [2024-07-15 00:15:14.576433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.910 NEW_FUNC[1/669]: 0x48b790 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:15.910 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:15.910 #6 NEW cov: 11460 ft: 11457 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 4 ShuffleBytes-ChangeByte-ChangeByte-CrossOver- 00:07:15.910 [2024-07-15 00:15:14.887210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2d cdw11:00000000 00:07:15.910 [2024-07-15 00:15:14.887247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.910 #7 NEW cov: 11573 ft: 11937 corp: 3/5b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:15.910 [2024-07-15 00:15:14.927215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e60 cdw11:00000000 00:07:15.910 [2024-07-15 00:15:14.927239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.910 #13 NEW cov: 11579 ft: 12260 corp: 4/7b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeBit- 00:07:16.169 [2024-07-15 00:15:14.967291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.169 [2024-07-15 00:15:14.967315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.169 #14 NEW cov: 11664 ft: 12567 corp: 5/10b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 CrossOver- 00:07:16.169 [2024-07-15 00:15:15.007672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.007697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.007749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.007762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.007813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000002d cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.007826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.169 #15 NEW cov: 11664 ft: 12871 corp: 6/16b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:16.169 [2024-07-15 00:15:15.047764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a10 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.047788] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.047839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.047852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.047903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000002d cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.047916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.169 #16 NEW cov: 11664 ft: 12945 corp: 7/22b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ChangeBit- 00:07:16.169 [2024-07-15 00:15:15.087703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.087726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.169 #17 NEW cov: 11664 ft: 12983 corp: 8/25b lim: 10 exec/s: 0 rss: 68Mb L: 3/6 MS: 1 ChangeBinInt- 00:07:16.169 [2024-07-15 00:15:15.127780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2f cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.127803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.169 #18 NEW cov: 11664 ft: 13066 corp: 9/27b lim: 10 exec/s: 0 rss: 68Mb L: 2/6 MS: 1 ChangeBit- 00:07:16.169 [2024-07-15 00:15:15.168240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.168263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.168328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005900 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.168341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.168391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.168404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.169 [2024-07-15 00:15:15.168452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.168466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.169 #19 NEW cov: 11664 ft: 13305 corp: 10/36b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:16.169 [2024-07-15 00:15:15.208121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.208145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:16.169 [2024-07-15 00:15:15.208195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002d2d cdw11:00000000 00:07:16.169 [2024-07-15 00:15:15.208209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.429 #20 NEW cov: 11664 ft: 13493 corp: 11/40b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 CopyPart- 00:07:16.429 [2024-07-15 00:15:15.248341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a40 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.248366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.429 [2024-07-15 00:15:15.248414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.248428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.429 [2024-07-15 00:15:15.248478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000002d cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.248491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.429 #21 NEW cov: 11664 ft: 13516 corp: 12/46b lim: 10 exec/s: 0 rss: 68Mb L: 6/9 MS: 1 ChangeBit- 00:07:16.429 [2024-07-15 00:15:15.288229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.288253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.429 #22 NEW cov: 11664 ft: 13529 corp: 13/48b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 CrossOver- 00:07:16.429 [2024-07-15 00:15:15.328437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.328469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.429 [2024-07-15 00:15:15.328518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d32d cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.328532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.429 #23 NEW cov: 11664 ft: 13571 corp: 14/52b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 ChangeBinInt- 00:07:16.429 [2024-07-15 00:15:15.368473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.368499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.429 #24 NEW cov: 11664 ft: 13592 corp: 15/55b lim: 10 exec/s: 0 rss: 68Mb L: 3/9 MS: 1 ChangeByte- 00:07:16.429 [2024-07-15 00:15:15.408864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.408889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.429 
[2024-07-15 00:15:15.408956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005926 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.408969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.429 [2024-07-15 00:15:15.409020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.409034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.429 [2024-07-15 00:15:15.409083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002626 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.409096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.429 #25 NEW cov: 11664 ft: 13627 corp: 16/64b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:16.429 [2024-07-15 00:15:15.448774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f200 cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.448799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.429 [2024-07-15 00:15:15.448864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d32d cdw11:00000000 00:07:16.429 [2024-07-15 00:15:15.448878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.429 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:16.429 #26 NEW cov: 11687 ft: 13671 corp: 17/68b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 CMP- DE: "\362\000"- 00:07:16.689 [2024-07-15 00:15:15.489131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000af2 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.489157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.489223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.489236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.489287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.489300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.489350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.489367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.689 #27 NEW cov: 11687 ft: 13695 corp: 18/77b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 PersAutoDict- DE: "\362\000"- 00:07:16.689 [2024-07-15 00:15:15.528915] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.528939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 #28 NEW cov: 11687 ft: 13710 corp: 19/80b lim: 10 exec/s: 0 rss: 68Mb L: 3/9 MS: 1 ShuffleBytes- 00:07:16.689 [2024-07-15 00:15:15.558996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.559020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 #29 NEW cov: 11687 ft: 13781 corp: 20/82b lim: 10 exec/s: 29 rss: 68Mb L: 2/9 MS: 1 CrossOver- 00:07:16.689 [2024-07-15 00:15:15.599187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000310a cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.599211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.599262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a2d cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.599275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.689 #30 NEW cov: 11687 ft: 13821 corp: 21/87b lim: 10 exec/s: 30 rss: 69Mb L: 5/9 MS: 1 InsertByte- 00:07:16.689 [2024-07-15 00:15:15.639199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a59 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.639223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 #31 NEW cov: 11687 ft: 13836 corp: 22/89b lim: 10 exec/s: 31 rss: 69Mb L: 2/9 MS: 1 EraseBytes- 00:07:16.689 [2024-07-15 00:15:15.669676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.669700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.669750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000220a cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.669762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.669812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e59 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.669825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.669871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.669884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.689 #32 NEW cov: 11687 ft: 13843 corp: 23/98b lim: 10 exec/s: 32 rss: 69Mb L: 9/9 MS: 1 CrossOver- 00:07:16.689 [2024-07-15 00:15:15.709446] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f200 cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.709471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 #33 NEW cov: 11687 ft: 13886 corp: 24/101b lim: 10 exec/s: 33 rss: 69Mb L: 3/9 MS: 1 PersAutoDict- DE: "\362\000"- 00:07:16.689 [2024-07-15 00:15:15.739845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.739869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.739937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.739950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.739999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.740013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.689 [2024-07-15 00:15:15.740062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:16.689 [2024-07-15 00:15:15.740075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.949 #35 NEW cov: 11687 ft: 13898 corp: 25/109b lim: 10 exec/s: 35 rss: 69Mb L: 8/9 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:16.949 [2024-07-15 00:15:15.779832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000310a cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.779856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.779908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000600a cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.779921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.779972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d2d cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.779985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.949 #36 NEW cov: 11687 ft: 13911 corp: 26/115b lim: 10 exec/s: 36 rss: 69Mb L: 6/9 MS: 1 InsertByte- 00:07:16.949 [2024-07-15 00:15:15.819869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000310a cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.819894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.819943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a27 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.819957] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.949 #37 NEW cov: 11687 ft: 13925 corp: 27/120b lim: 10 exec/s: 37 rss: 69Mb L: 5/9 MS: 1 ChangeByte- 00:07:16.949 [2024-07-15 00:15:15.860129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a10 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.860154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.860203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004300 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.860215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.860278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.860291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.949 #38 NEW cov: 11687 ft: 13953 corp: 28/127b lim: 10 exec/s: 38 rss: 69Mb L: 7/9 MS: 1 InsertByte- 00:07:16.949 [2024-07-15 00:15:15.899962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.899985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.949 #39 NEW cov: 11687 ft: 13979 corp: 29/129b lim: 10 exec/s: 39 rss: 69Mb L: 2/9 MS: 1 CopyPart- 00:07:16.949 [2024-07-15 00:15:15.940042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e60 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.940066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.949 #40 NEW cov: 11687 ft: 13999 corp: 30/131b lim: 10 exec/s: 40 rss: 69Mb L: 2/9 MS: 1 ChangeByte- 00:07:16.949 [2024-07-15 00:15:15.980413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a10 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.980437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.980493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.980506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.949 [2024-07-15 00:15:15.980571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f200 cdw11:00000000 00:07:16.949 [2024-07-15 00:15:15.980585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.949 #41 NEW cov: 11687 ft: 14006 corp: 31/137b lim: 10 exec/s: 41 rss: 69Mb L: 6/9 MS: 1 PersAutoDict- DE: "\362\000"- 00:07:17.209 [2024-07-15 00:15:16.020631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.020656] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.020704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.020717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.020766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000059 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.020780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.020829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000002d cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.020841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.209 #42 NEW cov: 11687 ft: 14053 corp: 32/145b lim: 10 exec/s: 42 rss: 69Mb L: 8/9 MS: 1 CrossOver- 00:07:17.209 [2024-07-15 00:15:16.060913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.060937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.061003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000220a cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.061016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.061066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e59 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.061079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.061135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.061149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.061200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.061213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.209 #43 NEW cov: 11687 ft: 14131 corp: 33/155b lim: 10 exec/s: 43 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:07:17.209 [2024-07-15 00:15:16.100550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.100574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.209 #44 NEW cov: 11687 ft: 14134 corp: 34/157b lim: 10 exec/s: 44 rss: 69Mb L: 2/10 MS: 1 CMP- DE: "\002\000"- 00:07:17.209 [2024-07-15 00:15:16.140899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000310a cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.140923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.140974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000600a cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.140987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.141037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d29 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.141050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.209 #45 NEW cov: 11687 ft: 14158 corp: 35/163b lim: 10 exec/s: 45 rss: 69Mb L: 6/10 MS: 1 ChangeBit- 00:07:17.209 [2024-07-15 00:15:16.180894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000310a cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.180918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.180969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.180983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.209 #46 NEW cov: 11687 ft: 14218 corp: 36/167b lim: 10 exec/s: 46 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:07:17.209 [2024-07-15 00:15:16.221003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e60 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.221027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.209 [2024-07-15 00:15:16.221078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000e60 cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.221092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.209 #47 NEW cov: 11687 ft: 14237 corp: 37/171b lim: 10 exec/s: 47 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:17.209 [2024-07-15 00:15:16.261020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2f cdw11:00000000 00:07:17.209 [2024-07-15 00:15:16.261044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.469 #48 NEW cov: 11687 ft: 14243 corp: 38/174b lim: 10 exec/s: 48 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:17.469 [2024-07-15 00:15:16.301444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.301471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.301523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000220a cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.301536] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.301589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000e2f cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.301602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.301651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.301663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.469 #49 NEW cov: 11687 ft: 14251 corp: 39/183b lim: 10 exec/s: 49 rss: 70Mb L: 9/10 MS: 1 ChangeByte- 00:07:17.469 [2024-07-15 00:15:16.341231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.341254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.469 #50 NEW cov: 11687 ft: 14287 corp: 40/186b lim: 10 exec/s: 50 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:17.469 [2024-07-15 00:15:16.381656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.381680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.381748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005900 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.381762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.381813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.381826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.381876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000000f2 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.381889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.469 #51 NEW cov: 11687 ft: 14299 corp: 41/195b lim: 10 exec/s: 51 rss: 70Mb L: 9/10 MS: 1 PersAutoDict- DE: "\362\000"- 00:07:17.469 [2024-07-15 00:15:16.421555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.421579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.421629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002dd2 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.421642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.469 #52 NEW cov: 11687 ft: 14327 corp: 42/199b lim: 10 exec/s: 52 rss: 70Mb 
L: 4/10 MS: 1 ChangeBinInt- 00:07:17.469 [2024-07-15 00:15:16.461652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.461676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.469 [2024-07-15 00:15:16.461742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.461758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.469 #53 NEW cov: 11687 ft: 14345 corp: 43/203b lim: 10 exec/s: 53 rss: 70Mb L: 4/10 MS: 1 EraseBytes- 00:07:17.469 [2024-07-15 00:15:16.501686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a13 cdw11:00000000 00:07:17.469 [2024-07-15 00:15:16.501710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.728 #54 NEW cov: 11687 ft: 14358 corp: 44/206b lim: 10 exec/s: 54 rss: 70Mb L: 3/10 MS: 1 ChangeBinInt- 00:07:17.728 [2024-07-15 00:15:16.542102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000af2 cdw11:00000000 00:07:17.728 [2024-07-15 00:15:16.542126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.728 [2024-07-15 00:15:16.542176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.728 [2024-07-15 00:15:16.542189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.728 [2024-07-15 00:15:16.542239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.728 [2024-07-15 00:15:16.542268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.728 [2024-07-15 00:15:16.542319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000022 cdw11:00000000 00:07:17.729 [2024-07-15 00:15:16.542332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.729 #55 NEW cov: 11687 ft: 14369 corp: 45/215b lim: 10 exec/s: 27 rss: 70Mb L: 9/10 MS: 1 CrossOver- 00:07:17.729 #55 DONE cov: 11687 ft: 14369 corp: 45/215b lim: 10 exec/s: 27 rss: 70Mb 00:07:17.729 ###### Recommended dictionary. ###### 00:07:17.729 "\362\000" # Uses: 4 00:07:17.729 "\002\000" # Uses: 0 00:07:17.729 ###### End of recommended dictionary. 
###### 00:07:17.729 Done 55 runs in 2 second(s) 00:07:17.729 00:15:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:17.729 00:15:16 -- ../common.sh@72 -- # (( i++ )) 00:07:17.729 00:15:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:17.729 00:15:16 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:17.729 00:15:16 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:17.729 00:15:16 -- nvmf/run.sh@24 -- # local timen=1 00:07:17.729 00:15:16 -- nvmf/run.sh@25 -- # local core=0x1 00:07:17.729 00:15:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:17.729 00:15:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:17.729 00:15:16 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:17.729 00:15:16 -- nvmf/run.sh@29 -- # port=4407 00:07:17.729 00:15:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:17.729 00:15:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:17.729 00:15:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:17.729 00:15:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:17.729 [2024-07-15 00:15:16.729964] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:17.729 [2024-07-15 00:15:16.730032] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid329212 ] 00:07:17.729 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.988 [2024-07-15 00:15:16.912215] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.988 [2024-07-15 00:15:16.976130] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:17.988 [2024-07-15 00:15:16.976276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.988 [2024-07-15 00:15:17.034256] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:18.247 [2024-07-15 00:15:17.050566] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:18.247 INFO: Running with entropic power schedule (0xFF, 100). 00:07:18.247 INFO: Seed: 1707423812 00:07:18.247 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:18.247 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:18.247 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:18.247 INFO: A corpus is not provided, starting from an empty corpus 00:07:18.247 #2 INITED exec/s: 0 rss: 60Mb 00:07:18.247 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
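A note on the two blocks above before run 7's output continues. The epilogue of run 6 ("#55 DONE cov: 11687 ft: 14369 corp: 45/215b lim: 10 exec/s: 27 rss: 70Mb") reads as: 11687 coverage points hit, 14369 features, a corpus of 45 inputs totaling 215 bytes, a 10-byte input-length cap, 27 executions per second, and 70 MB resident memory. Its "Recommended dictionary" lists byte sequences that repeatedly produced new coverage, printed as C octal escapes: "\362\000" is bytes 0xF2 0x00 and "\002\000" is 0x02 0x00, matching the cdw10:0000f200 and cdw10:00000200 commands seen in that run. The bash trace that follows shows how nvmf/run.sh isolates instance 7: the fuzzer index is rendered with printf %02d and appended to 44 to pick TCP port 4407, which is then substituted into both the transport ID and, via sed, the JSON config (replacing the default 4420) so each target gets its own NVMe/TCP listener. The sketch below re-derives that port and stores the recommended tokens in libFuzzer's -dict= file format; -dict= and the hex-escape entry syntax are standard libFuzzer features, but the dictionary path is invented here and this log never shows llvm_nvme_fuzz being passed -dict=, so the final command is a hypothetical replay, not part of run.sh. (The "no interesting inputs" warning bracketing this note is expected: run 7 starts from an empty corpus directory.)

    # Sketch: re-derive run.sh's per-instance port and reuse run 6's dictionary.
    i=7                                     # fuzzer index, as in 'start_llvm_fuzz 7 1 0x1'
    port="44$(printf %02d "$i")"            # 7 -> 4407, the arithmetic traced above
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Recommended dictionary re-encoded with hex escapes, the form -dict= files accept:
    # "\362\000" (octal) == \xF2\x00, "\002\000" == \x02\x00. The path is made up.
    printf '%s\n' '"\xF2\x00"' '"\x02\x00"' > /tmp/llvm_nvmf_6.dict

    # Hypothetical replay; flags abridged from the logged invocation. Whether this
    # binary forwards -dict= through to libFuzzer is not shown in this log.
    ./test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -F "$trid" -c "/tmp/fuzz_json_${i}.conf" -t 1 \
        -D "../corpus/llvm_nvmf_${i}" -Z "$i" -dict=/tmp/llvm_nvmf_6.dict

Deriving the port from the fuzzer index keeps every target's listener distinct while the rest of the JSON config stays shared, which is why the sed step only rewrites the trsvcid field.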
00:07:18.247 This may also happen if the target rejected all inputs we tried so far 00:07:18.247 [2024-07-15 00:15:17.117028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:18.247 [2024-07-15 00:15:17.117067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.247 [2024-07-15 00:15:17.117194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:18.247 [2024-07-15 00:15:17.117211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.247 [2024-07-15 00:15:17.117317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:18.247 [2024-07-15 00:15:17.117334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.506 NEW_FUNC[1/669]: 0x48c180 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:18.506 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:18.506 #4 NEW cov: 11460 ft: 11461 corp: 2/7b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:18.506 [2024-07-15 00:15:17.457434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008080 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.457491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.506 #8 NEW cov: 11573 ft: 12301 corp: 3/9b lim: 10 exec/s: 0 rss: 67Mb L: 2/6 MS: 4 ChangeBit-ChangeBit-ChangeBit-CopyPart- 00:07:18.506 [2024-07-15 00:15:17.508310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.508339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.508483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.508500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.508618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.508634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.508755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.508777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.508888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 
00:15:17.508904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.506 #11 NEW cov: 11579 ft: 12806 corp: 4/19b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 3 EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:18.506 [2024-07-15 00:15:17.558461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.558489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.558622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.558638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.558764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.558781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.558898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.558915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.506 [2024-07-15 00:15:17.559041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.506 [2024-07-15 00:15:17.559058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.766 #12 NEW cov: 11664 ft: 13072 corp: 5/29b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:18.766 [2024-07-15 00:15:17.608646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.608673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.608793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.608811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.608930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.608946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.609055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.609069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.609178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 
cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.609193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.766 #13 NEW cov: 11664 ft: 13258 corp: 6/39b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ChangeByte- 00:07:18.766 [2024-07-15 00:15:17.647878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.647908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.648024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f7ff cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.648040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.648155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.648170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.766 #14 NEW cov: 11664 ft: 13373 corp: 7/45b lim: 10 exec/s: 0 rss: 67Mb L: 6/10 MS: 1 ChangeBit- 00:07:18.766 [2024-07-15 00:15:17.688806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000008a cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.688832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.688938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.688954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.689060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.689077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.689182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.689198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.766 [2024-07-15 00:15:17.689306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.766 [2024-07-15 00:15:17.689322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.766 #15 NEW cov: 11664 ft: 13412 corp: 8/55b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:18.766 [2024-07-15 00:15:17.728907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.728932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.729046] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.729074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.729183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.729201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.729314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.729329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.729436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.729455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.767 #16 NEW cov: 11664 ft: 13477 corp: 9/65b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:18.767 [2024-07-15 00:15:17.779109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.779134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.779246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.779261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.779369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.779384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.779496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.779511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.779618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.779633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.767 #17 NEW cov: 11664 ft: 13522 corp: 10/75b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:18.767 [2024-07-15 00:15:17.819088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.819114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.819238] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.819270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.819385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.819402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.767 [2024-07-15 00:15:17.819519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:18.767 [2024-07-15 00:15:17.819537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.026 #18 NEW cov: 11664 ft: 13579 corp: 11/84b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 EraseBytes- 00:07:19.026 [2024-07-15 00:15:17.858695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f7ff cdw11:00000000 00:07:19.026 [2024-07-15 00:15:17.858721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.026 [2024-07-15 00:15:17.858832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:19.026 [2024-07-15 00:15:17.858849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.026 #19 NEW cov: 11664 ft: 13767 corp: 12/88b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 EraseBytes- 00:07:19.026 [2024-07-15 00:15:17.899197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.026 [2024-07-15 00:15:17.899223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.026 [2024-07-15 00:15:17.899344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.026 [2024-07-15 00:15:17.899363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:17.899482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000aeff cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.899499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.027 #20 NEW cov: 11664 ft: 13781 corp: 13/95b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:07:19.027 [2024-07-15 00:15:17.938869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a26 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.938896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.027 #21 NEW cov: 11664 ft: 13868 corp: 14/97b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 InsertByte- 00:07:19.027 [2024-07-15 00:15:17.979784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff 
cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.979808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:17.979927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.979943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:17.980065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.980091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:17.980209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000002ff cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.980225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:17.980341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:17.980359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.027 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:19.027 #22 NEW cov: 11687 ft: 13902 corp: 15/107b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:19.027 [2024-07-15 00:15:18.019711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.019738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:18.019855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.019873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:18.019987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.020005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:18.020121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.020138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.027 #23 NEW cov: 11687 ft: 13918 corp: 16/115b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 EraseBytes- 00:07:19.027 [2024-07-15 00:15:18.059438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.059467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 
00:15:18.059587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.059603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:18.059717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.059732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:18.059843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.059857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.027 [2024-07-15 00:15:18.059972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000003d cdw11:00000000 00:07:19.027 [2024-07-15 00:15:18.059987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.027 #24 NEW cov: 11687 ft: 13930 corp: 17/125b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:19.287 [2024-07-15 00:15:18.110170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000008a cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.110200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.110324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.110341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.110454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.110471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.110581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.110596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.110713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.110729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.287 #25 NEW cov: 11687 ft: 13990 corp: 18/135b lim: 10 exec/s: 25 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:07:19.287 [2024-07-15 00:15:18.149912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.149939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 
00:15:18.150048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.150065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.150177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.150196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.150313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.150330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.150445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.150462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.287 #26 NEW cov: 11687 ft: 14016 corp: 19/145b lim: 10 exec/s: 26 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:19.287 [2024-07-15 00:15:18.190392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.190418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.190537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.190554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.190675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000002b cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.190691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.190807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.190824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.190943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.190959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.287 #27 NEW cov: 11687 ft: 14030 corp: 20/155b lim: 10 exec/s: 27 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:07:19.287 [2024-07-15 00:15:18.229713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff7f cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.229739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 
00:15:18.229851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.229868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.229988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000aeff cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.230005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.287 #28 NEW cov: 11687 ft: 14044 corp: 21/162b lim: 10 exec/s: 28 rss: 69Mb L: 7/10 MS: 1 ChangeBit- 00:07:19.287 [2024-07-15 00:15:18.280714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.280739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.280852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.280869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.280988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.281004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.281119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.281137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.281247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.281264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.287 #29 NEW cov: 11687 ft: 14125 corp: 22/172b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:19.287 [2024-07-15 00:15:18.320807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.320832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.320937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.287 [2024-07-15 00:15:18.320952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.287 [2024-07-15 00:15:18.321073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.288 [2024-07-15 00:15:18.321090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.288 [2024-07-15 
00:15:18.321199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.288 [2024-07-15 00:15:18.321216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.288 [2024-07-15 00:15:18.321330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.288 [2024-07-15 00:15:18.321346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.547 #30 NEW cov: 11687 ft: 14155 corp: 23/182b lim: 10 exec/s: 30 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:19.547 [2024-07-15 00:15:18.360433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008000 cdw11:00000000 00:07:19.547 [2024-07-15 00:15:18.360462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.547 [2024-07-15 00:15:18.360586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.547 [2024-07-15 00:15:18.360602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.547 [2024-07-15 00:15:18.360722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.547 [2024-07-15 00:15:18.360737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.547 [2024-07-15 00:15:18.360854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.547 [2024-07-15 00:15:18.360870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.547 [2024-07-15 00:15:18.360987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.547 [2024-07-15 00:15:18.361005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.547 #31 NEW cov: 11687 ft: 14164 corp: 24/192b lim: 10 exec/s: 31 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:19.547 [2024-07-15 00:15:18.400958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:19.547 [2024-07-15 00:15:18.400983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.401095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.401111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.401218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.401235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 
00:15:18.401353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.401369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.401485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.401503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.548 #32 NEW cov: 11687 ft: 14175 corp: 25/202b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:19.548 [2024-07-15 00:15:18.441169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008aff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.441195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.441301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.441319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.441428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.441450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.441523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.441540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.441649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.441666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.548 #33 NEW cov: 11687 ft: 14199 corp: 26/212b lim: 10 exec/s: 33 rss: 69Mb L: 10/10 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:19.548 [2024-07-15 00:15:18.480828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.480855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.480976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.480993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.481109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fe02 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.481123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:19.548 #34 NEW cov: 11687 ft: 14228 corp: 27/218b lim: 10 exec/s: 34 rss: 69Mb L: 6/10 MS: 1 ChangeBinInt- 00:07:19.548 [2024-07-15 00:15:18.521409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.521436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.521549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.521565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.521677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.521693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.521808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000002ff cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.521825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.521931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.521945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.548 #35 NEW cov: 11687 ft: 14237 corp: 28/228b lim: 10 exec/s: 35 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:19.548 [2024-07-15 00:15:18.561354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.561379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.561486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002b cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.561502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.561615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.561631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.561748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.561764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.548 #36 NEW cov: 11687 ft: 14248 corp: 29/236b lim: 10 exec/s: 36 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:07:19.548 [2024-07-15 00:15:18.601253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000008a cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.601280] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.601393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.601409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.601526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.601543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.601653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000f9 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.601669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.548 [2024-07-15 00:15:18.601780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.548 [2024-07-15 00:15:18.601796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.808 #37 NEW cov: 11687 ft: 14266 corp: 30/246b lim: 10 exec/s: 37 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:19.808 [2024-07-15 00:15:18.661589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.808 [2024-07-15 00:15:18.661614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.661742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.661758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.661872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ae7e cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.661888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.662007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.662023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.809 #38 NEW cov: 11687 ft: 14359 corp: 31/254b lim: 10 exec/s: 38 rss: 69Mb L: 8/10 MS: 1 InsertByte- 00:07:19.809 [2024-07-15 00:15:18.712013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.712039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.712146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.712162] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.712275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.712292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.712404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.712419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.712530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.712547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.809 #39 NEW cov: 11687 ft: 14379 corp: 32/264b lim: 10 exec/s: 39 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:19.809 [2024-07-15 00:15:18.761572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000008a cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.761600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.761711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002b cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.761727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.761834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.761848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.761959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.761975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.809 #40 NEW cov: 11687 ft: 14392 corp: 33/272b lim: 10 exec/s: 40 rss: 69Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:19.809 [2024-07-15 00:15:18.802097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f7ff cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.802123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.802235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.802251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.802355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.802371] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.802489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.802504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.809 #41 NEW cov: 11687 ft: 14400 corp: 34/280b lim: 10 exec/s: 41 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:07:19.809 [2024-07-15 00:15:18.842052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.842079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.842188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.842204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.842311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.842327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.842433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.842452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.809 [2024-07-15 00:15:18.842564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:19.809 [2024-07-15 00:15:18.842579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.809 #42 NEW cov: 11687 ft: 14426 corp: 35/290b lim: 10 exec/s: 42 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:20.069 [2024-07-15 00:15:18.891687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.891711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.069 #45 NEW cov: 11687 ft: 14455 corp: 36/292b lim: 10 exec/s: 45 rss: 70Mb L: 2/10 MS: 3 EraseBytes-CrossOver-InsertByte- 00:07:20.069 [2024-07-15 00:15:18.931931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.931958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:18.932066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000080 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.932081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.069 #46 NEW cov: 11687 ft: 14490 corp: 37/297b lim: 10 exec/s: 46 rss: 
70Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:20.069 [2024-07-15 00:15:18.972472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.972498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:18.972623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.972638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:18.972750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ae7e cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.972766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:18.972881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffc3 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:18.972897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.012615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.012641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.012753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.012778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.012897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.012913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.013021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.013039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.069 #48 NEW cov: 11687 ft: 14493 corp: 38/306b lim: 10 exec/s: 48 rss: 70Mb L: 9/10 MS: 2 InsertByte-ChangeBinInt- 00:07:20.069 [2024-07-15 00:15:19.052751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002bff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.052778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.052895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.052911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.053013] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.053030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.053132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.053148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.053262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.053279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.069 #49 NEW cov: 11687 ft: 14497 corp: 39/316b lim: 10 exec/s: 49 rss: 70Mb L: 10/10 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:20.069 [2024-07-15 00:15:19.092671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.092698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.092814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.092830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.069 [2024-07-15 00:15:19.092941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:20.069 [2024-07-15 00:15:19.092958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.069 #50 NEW cov: 11687 ft: 14511 corp: 40/322b lim: 10 exec/s: 25 rss: 70Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:20.069 #50 DONE cov: 11687 ft: 14511 corp: 40/322b lim: 10 exec/s: 25 rss: 70Mb 00:07:20.069 ###### Recommended dictionary. ###### 00:07:20.069 "\377\377\377\377\377\377\377\377" # Uses: 1 00:07:20.069 ###### End of recommended dictionary. 
###### 00:07:20.069 Done 50 runs in 2 second(s) 00:07:20.328 00:15:19 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:20.328 00:15:19 -- ../common.sh@72 -- # (( i++ )) 00:07:20.328 00:15:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:20.328 00:15:19 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:20.328 00:15:19 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:20.328 00:15:19 -- nvmf/run.sh@24 -- # local timen=1 00:07:20.328 00:15:19 -- nvmf/run.sh@25 -- # local core=0x1 00:07:20.328 00:15:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:20.328 00:15:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:20.328 00:15:19 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:20.328 00:15:19 -- nvmf/run.sh@29 -- # port=4408 00:07:20.328 00:15:19 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:20.328 00:15:19 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:20.328 00:15:19 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:20.328 00:15:19 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:20.328 [2024-07-15 00:15:19.284964] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:20.328 [2024-07-15 00:15:19.285032] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid329759 ] 00:07:20.328 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.587 [2024-07-15 00:15:19.460720] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.587 [2024-07-15 00:15:19.521903] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:20.587 [2024-07-15 00:15:19.522047] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.587 [2024-07-15 00:15:19.579776] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:20.587 [2024-07-15 00:15:19.596027] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:20.587 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:20.587 INFO: Seed: 4254435001 00:07:20.587 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:20.587 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:20.587 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:20.587 INFO: A corpus is not provided, starting from an empty corpus 00:07:20.587 [2024-07-15 00:15:19.640747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.587 [2024-07-15 00:15:19.640782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.846 #2 INITED cov: 11488 ft: 11484 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:20.846 [2024-07-15 00:15:19.690745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.846 [2024-07-15 00:15:19.690778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.846 [2024-07-15 00:15:19.690812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.846 [2024-07-15 00:15:19.690827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.846 #3 NEW cov: 11601 ft: 12700 corp: 2/3b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:20.846 [2024-07-15 00:15:19.760852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.846 [2024-07-15 00:15:19.760883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.846 #4 NEW cov: 11607 ft: 12941 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeByte- 00:07:20.846 [2024-07-15 00:15:19.810990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.846 [2024-07-15 00:15:19.811020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.846 #5 NEW cov: 11692 ft: 13330 corp: 4/5b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeBit- 00:07:20.846 [2024-07-15 00:15:19.871097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.846 [2024-07-15 00:15:19.871128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.104 #6 NEW cov: 11692 ft: 13415 corp: 5/6b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeByte- 00:07:21.104 [2024-07-15 00:15:19.921314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.104 [2024-07-15 00:15:19.921351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.104 #7 
NEW cov: 11692 ft: 13517 corp: 6/7b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 CopyPart- 00:07:21.105 [2024-07-15 00:15:19.991523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.105 [2024-07-15 00:15:19.991566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.105 [2024-07-15 00:15:19.991612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.105 [2024-07-15 00:15:19.991628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.105 #8 NEW cov: 11692 ft: 13619 corp: 7/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:21.105 [2024-07-15 00:15:20.051698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.105 [2024-07-15 00:15:20.051734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.105 #9 NEW cov: 11692 ft: 13654 corp: 8/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 EraseBytes- 00:07:21.105 [2024-07-15 00:15:20.121973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.105 [2024-07-15 00:15:20.122009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.105 [2024-07-15 00:15:20.122043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.105 [2024-07-15 00:15:20.122059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.105 [2024-07-15 00:15:20.122088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.105 [2024-07-15 00:15:20.122103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.364 #10 NEW cov: 11692 ft: 13888 corp: 9/13b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:07:21.364 [2024-07-15 00:15:20.192224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.192256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.192290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.192307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.192336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) 
qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.192352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.192381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.192396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.364 #11 NEW cov: 11692 ft: 14175 corp: 10/17b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 CopyPart- 00:07:21.364 [2024-07-15 00:15:20.252436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.252475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.252509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.252527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.252557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.252573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.252603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.252619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.364 [2024-07-15 00:15:20.252648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.252664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:21.364 #12 NEW cov: 11692 ft: 14342 corp: 11/22b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CrossOver- 00:07:21.364 [2024-07-15 00:15:20.322371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.322403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.364 #13 NEW cov: 11692 ft: 14368 corp: 12/23b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:21.364 [2024-07-15 00:15:20.372469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.364 [2024-07-15 00:15:20.372502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.364 #14 NEW cov: 11692 ft: 14398 corp: 13/24b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:21.623 [2024-07-15 00:15:20.422676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.623 [2024-07-15 00:15:20.422709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.623 [2024-07-15 00:15:20.422744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.623 [2024-07-15 00:15:20.422761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.623 #15 NEW cov: 11692 ft: 14442 corp: 14/26b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 CopyPart- 00:07:21.623 [2024-07-15 00:15:20.492879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.623 [2024-07-15 00:15:20.492911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.623 [2024-07-15 00:15:20.492949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.623 [2024-07-15 00:15:20.492965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.882 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:21.882 #16 NEW cov: 11715 ft: 14465 corp: 15/28b lim: 5 exec/s: 16 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:21.882 [2024-07-15 00:15:20.833795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.882 [2024-07-15 00:15:20.833835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.882 #17 NEW cov: 11715 ft: 14572 corp: 16/29b lim: 5 exec/s: 17 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:07:21.882 [2024-07-15 00:15:20.893986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.882 [2024-07-15 00:15:20.894017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.882 [2024-07-15 00:15:20.894050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.882 [2024-07-15 00:15:20.894065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.882 [2024-07-15 00:15:20.894095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.882 [2024-07-15 00:15:20.894111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.882 #18 NEW cov: 11715 ft: 14597 corp: 17/32b lim: 5 exec/s: 18 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:07:22.142 [2024-07-15 00:15:20.944087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:20.944118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:20.944168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:20.944184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:20.944215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:20.944231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.142 #19 NEW cov: 11715 ft: 14686 corp: 18/35b lim: 5 exec/s: 19 rss: 68Mb L: 3/5 MS: 1 ChangeASCIIInt- 00:07:22.142 [2024-07-15 00:15:21.014289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.014320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:21.014353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.014369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:21.014399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.014418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.142 #20 NEW cov: 11715 ft: 14690 corp: 19/38b lim: 5 exec/s: 20 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:22.142 [2024-07-15 00:15:21.064255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.064284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.142 #21 NEW cov: 11715 ft: 14696 corp: 20/39b lim: 5 exec/s: 21 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:22.142 [2024-07-15 00:15:21.114365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.114394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.142 #22 NEW cov: 
11715 ft: 14732 corp: 21/40b lim: 5 exec/s: 22 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:07:22.142 [2024-07-15 00:15:21.164688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.164717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:21.164765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.164780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:21.164810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.164825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.142 [2024-07-15 00:15:21.164853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.142 [2024-07-15 00:15:21.164868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.402 #23 NEW cov: 11715 ft: 14766 corp: 22/44b lim: 5 exec/s: 23 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:22.402 [2024-07-15 00:15:21.214667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.214699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.402 #24 NEW cov: 11715 ft: 14781 corp: 23/45b lim: 5 exec/s: 24 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:22.402 [2024-07-15 00:15:21.274927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.274957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.402 [2024-07-15 00:15:21.274990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.275006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.402 #25 NEW cov: 11715 ft: 14815 corp: 24/47b lim: 5 exec/s: 25 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:07:22.402 [2024-07-15 00:15:21.325049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.325082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.402 [2024-07-15 00:15:21.325130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.325146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.402 [2024-07-15 00:15:21.325175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.325190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.402 #26 NEW cov: 11715 ft: 14874 corp: 25/50b lim: 5 exec/s: 26 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:22.402 [2024-07-15 00:15:21.395193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.395223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.402 #27 NEW cov: 11715 ft: 14904 corp: 26/51b lim: 5 exec/s: 27 rss: 69Mb L: 1/5 MS: 1 ChangeByte- 00:07:22.402 [2024-07-15 00:15:21.455463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.455495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.402 [2024-07-15 00:15:21.455545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.455571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.402 [2024-07-15 00:15:21.455600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.402 [2024-07-15 00:15:21.455616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.662 #28 NEW cov: 11715 ft: 14919 corp: 27/54b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:22.662 [2024-07-15 00:15:21.505454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.662 [2024-07-15 00:15:21.505486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.662 #29 NEW cov: 11716 ft: 14942 corp: 28/55b lim: 5 exec/s: 29 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:22.662 [2024-07-15 00:15:21.555608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.662 [2024-07-15 00:15:21.555638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.662 [2024-07-15 00:15:21.555686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:22.662 [2024-07-15 00:15:21.555702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.662 #30 NEW cov: 11716 ft: 14951 corp: 29/57b lim: 5 exec/s: 30 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:22.662 [2024-07-15 00:15:21.615734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.662 [2024-07-15 00:15:21.615767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.662 #31 NEW cov: 11716 ft: 14981 corp: 30/58b lim: 5 exec/s: 15 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:22.662 #31 DONE cov: 11716 ft: 14981 corp: 30/58b lim: 5 exec/s: 15 rss: 69Mb 00:07:22.662 Done 31 runs in 2 second(s) 00:07:22.921 00:15:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:22.921 00:15:21 -- ../common.sh@72 -- # (( i++ )) 00:07:22.921 00:15:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:22.921 00:15:21 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:22.921 00:15:21 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:22.922 00:15:21 -- nvmf/run.sh@24 -- # local timen=1 00:07:22.922 00:15:21 -- nvmf/run.sh@25 -- # local core=0x1 00:07:22.922 00:15:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:22.922 00:15:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:22.922 00:15:21 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:22.922 00:15:21 -- nvmf/run.sh@29 -- # port=4409 00:07:22.922 00:15:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:22.922 00:15:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:22.922 00:15:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:22.922 00:15:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:22.922 [2024-07-15 00:15:21.818383] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:22.922 [2024-07-15 00:15:21.818534] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid330199 ] 00:07:22.922 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.181 [2024-07-15 00:15:22.007593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.181 [2024-07-15 00:15:22.069572] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:23.181 [2024-07-15 00:15:22.069719] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.181 [2024-07-15 00:15:22.127766] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:23.181 [2024-07-15 00:15:22.144068] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:23.181 INFO: Running with entropic power schedule (0xFF, 100). 00:07:23.181 INFO: Seed: 2507467340 00:07:23.181 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:23.181 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:23.181 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:23.181 INFO: A corpus is not provided, starting from an empty corpus 00:07:23.181 [2024-07-15 00:15:22.199365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.181 [2024-07-15 00:15:22.199394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.181 #2 INITED cov: 11483 ft: 11488 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:23.181 [2024-07-15 00:15:22.229296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.181 [2024-07-15 00:15:22.229320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.700 NEW_FUNC[1/1]: 0x1c37410 in accel_comp_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel_sw.c:554 00:07:23.700 #3 NEW cov: 11601 ft: 11929 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:23.700 [2024-07-15 00:15:22.560201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.560234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.700 #4 NEW cov: 11607 ft: 12213 corp: 3/3b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:23.700 [2024-07-15 00:15:22.600222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.600248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.700 #5 NEW cov: 11692 ft: 12444 corp: 4/4b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:23.700 [2024-07-15 00:15:22.640341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.640366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.700 #6 NEW cov: 11692 ft: 12515 corp: 5/5b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeBit- 00:07:23.700 [2024-07-15 00:15:22.680458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.680482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.700 #7 NEW cov: 11692 ft: 12613 corp: 6/6b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:23.700 [2024-07-15 00:15:22.721185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.721210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.700 [2024-07-15 00:15:22.721293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.721308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.700 [2024-07-15 00:15:22.721361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.721375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.700 [2024-07-15 00:15:22.721428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.721445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.700 [2024-07-15 00:15:22.721501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.700 [2024-07-15 00:15:22.721514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.700 #8 NEW cov: 11692 ft: 13552 corp: 7/11b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:23.959 [2024-07-15 00:15:22.770863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.770888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.959 [2024-07-15 00:15:22.770949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.770962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.959 #9 NEW cov: 11692 ft: 13766 corp: 8/13b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:23.959 [2024-07-15 00:15:22.810823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.810847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.959 #10 NEW cov: 11692 ft: 13829 corp: 9/14b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:07:23.959 [2024-07-15 00:15:22.850966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.850990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.959 #11 NEW cov: 11692 ft: 13892 corp: 10/15b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:23.959 [2024-07-15 00:15:22.891710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.891734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.959 [2024-07-15 00:15:22.891789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.891803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.959 [2024-07-15 00:15:22.891856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.959 [2024-07-15 00:15:22.891868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 00:15:22.891922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.891935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 00:15:22.891990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.892003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.960 #12 NEW cov: 11692 ft: 13922 corp: 11/20b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:23.960 [2024-07-15 00:15:22.931531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.931555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 
00:15:22.931615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.931628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 00:15:22.931686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.931703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.960 #13 NEW cov: 11692 ft: 14097 corp: 12/23b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 CMP- DE: "\231["- 00:07:23.960 [2024-07-15 00:15:22.971632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.971656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 00:15:22.971713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.971726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 00:15:22.971782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:22.971795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.960 #14 NEW cov: 11692 ft: 14149 corp: 13/26b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 CopyPart- 00:07:23.960 [2024-07-15 00:15:23.011581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:23.011606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.960 [2024-07-15 00:15:23.011676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.960 [2024-07-15 00:15:23.011689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.220 #15 NEW cov: 11692 ft: 14177 corp: 14/28b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:24.220 [2024-07-15 00:15:23.051779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.051804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.051860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 
[2024-07-15 00:15:23.051874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.051930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.051943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.220 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:24.220 #16 NEW cov: 11715 ft: 14195 corp: 15/31b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 PersAutoDict- DE: "\231["- 00:07:24.220 [2024-07-15 00:15:23.102012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.102037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.102095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.102112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.102184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.102198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.220 #17 NEW cov: 11715 ft: 14226 corp: 16/34b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:07:24.220 [2024-07-15 00:15:23.141919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.141944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.141997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.142011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.220 #18 NEW cov: 11715 ft: 14246 corp: 17/36b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:24.220 [2024-07-15 00:15:23.172153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.172177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.172241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.172254] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.172308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.172321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.220 #19 NEW cov: 11715 ft: 14312 corp: 18/39b lim: 5 exec/s: 19 rss: 68Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:24.220 [2024-07-15 00:15:23.212299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.212323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.212382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.212395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.212454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.212484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.220 #20 NEW cov: 11715 ft: 14393 corp: 19/42b lim: 5 exec/s: 20 rss: 68Mb L: 3/5 MS: 1 PersAutoDict- DE: "\231["- 00:07:24.220 [2024-07-15 00:15:23.252240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.252265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.220 [2024-07-15 00:15:23.252323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.220 [2024-07-15 00:15:23.252337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.480 #21 NEW cov: 11715 ft: 14432 corp: 20/44b lim: 5 exec/s: 21 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:24.480 [2024-07-15 00:15:23.292364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.292389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.292447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.292461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.480 #22 NEW cov: 11715 ft: 14446 corp: 21/46b lim: 5 exec/s: 22 rss: 
68Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:24.480 [2024-07-15 00:15:23.332633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.332658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.332715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.332729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.332785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.332798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.480 #23 NEW cov: 11715 ft: 14489 corp: 22/49b lim: 5 exec/s: 23 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:07:24.480 [2024-07-15 00:15:23.373099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.373123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.373197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.373211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.373268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.373281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.373335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.373348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.373402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.373416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.480 #24 NEW cov: 11715 ft: 14497 corp: 23/54b lim: 5 exec/s: 24 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:24.480 [2024-07-15 00:15:23.413181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.413207] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.413243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.413256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.413310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.413323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.413378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.413391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.480 [2024-07-15 00:15:23.413446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.480 [2024-07-15 00:15:23.413457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.480 #25 NEW cov: 11715 ft: 14508 corp: 24/59b lim: 5 exec/s: 25 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:07:24.481 [2024-07-15 00:15:23.453319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.453344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.453401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.453414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.453471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.453485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.453539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.453552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.453603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.453616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.481 #26 NEW cov: 11715 ft: 14513 corp: 25/64b lim: 5 exec/s: 26 rss: 69Mb L: 5/5 MS: 1 CrossOver- 00:07:24.481 [2024-07-15 00:15:23.493102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.493130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.493188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.493202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.493258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.493271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.481 #27 NEW cov: 11715 ft: 14532 corp: 26/67b lim: 5 exec/s: 27 rss: 69Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:24.481 [2024-07-15 00:15:23.533341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.533366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.533437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.533456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.533513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.533526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.481 [2024-07-15 00:15:23.533582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.481 [2024-07-15 00:15:23.533595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.740 #28 NEW cov: 11715 ft: 14541 corp: 27/71b lim: 5 exec/s: 28 rss: 69Mb L: 4/5 MS: 1 InsertByte- 00:07:24.740 [2024-07-15 00:15:23.573457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.573482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.573539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 
nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.573553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.573608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.573621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.573675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.573688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.740 #29 NEW cov: 11715 ft: 14558 corp: 28/75b lim: 5 exec/s: 29 rss: 69Mb L: 4/5 MS: 1 InsertByte- 00:07:24.740 [2024-07-15 00:15:23.613138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.613165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.740 #30 NEW cov: 11715 ft: 14574 corp: 29/76b lim: 5 exec/s: 30 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:07:24.740 [2024-07-15 00:15:23.643540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.643564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.643622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.643635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.643707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.643721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.740 #31 NEW cov: 11715 ft: 14602 corp: 30/79b lim: 5 exec/s: 31 rss: 69Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:24.740 [2024-07-15 00:15:23.683710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.683735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.683791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.683804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.683858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.683870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.740 #32 NEW cov: 11715 ft: 14605 corp: 31/82b lim: 5 exec/s: 32 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:24.740 [2024-07-15 00:15:23.723719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.723743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.723801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.723814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.740 #33 NEW cov: 11715 ft: 14619 corp: 32/84b lim: 5 exec/s: 33 rss: 69Mb L: 2/5 MS: 1 PersAutoDict- DE: "\231["- 00:07:24.740 [2024-07-15 00:15:23.764109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.764134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.764189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.764202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.764256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.764269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.740 [2024-07-15 00:15:23.764322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.740 [2024-07-15 00:15:23.764335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.740 #34 NEW cov: 11715 ft: 14658 corp: 33/88b lim: 5 exec/s: 34 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:07:25.000 [2024-07-15 00:15:23.804381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.804406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.804461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.804475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.804530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.804543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.804597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.804610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.804664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.804677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:25.000 #35 NEW cov: 11715 ft: 14724 corp: 34/93b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:25.000 [2024-07-15 00:15:23.844180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.844204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.844262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.844275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.844329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.844342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.000 #36 NEW cov: 11715 ft: 14741 corp: 35/96b lim: 5 exec/s: 36 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:25.000 [2024-07-15 00:15:23.884278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.884305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.884362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.884375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.884429] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.884446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.000 #37 NEW cov: 11715 ft: 14747 corp: 36/99b lim: 5 exec/s: 37 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:07:25.000 [2024-07-15 00:15:23.924719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.924743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.924798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.924812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.924865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.924878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.924929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.924942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.924996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.925009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:25.000 #38 NEW cov: 11715 ft: 14773 corp: 37/104b lim: 5 exec/s: 38 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\377\377"- 00:07:25.000 [2024-07-15 00:15:23.964656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.964680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.964734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.964747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.964801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.964814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:23.964870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:23.964883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.000 #39 NEW cov: 11715 ft: 14786 corp: 38/108b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 InsertByte- 00:07:25.000 [2024-07-15 00:15:24.004600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:24.004624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:24.004682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:24.004696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:24.004752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:24.004765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.000 #40 NEW cov: 11715 ft: 14800 corp: 39/111b lim: 5 exec/s: 40 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:07:25.000 [2024-07-15 00:15:24.044751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:24.044776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:24.044849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:24.044863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.000 [2024-07-15 00:15:24.044920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.000 [2024-07-15 00:15:24.044932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.259 #41 NEW cov: 11715 ft: 14819 corp: 40/114b lim: 5 exec/s: 41 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:25.259 [2024-07-15 00:15:24.084849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.084874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.259 [2024-07-15 00:15:24.084931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.084946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.259 [2024-07-15 00:15:24.084998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.085010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.259 #42 NEW cov: 11715 ft: 14859 corp: 41/117b lim: 5 exec/s: 42 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:07:25.259 [2024-07-15 00:15:24.125120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.125144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.259 [2024-07-15 00:15:24.125213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.125234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.259 [2024-07-15 00:15:24.125288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.125301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.259 [2024-07-15 00:15:24.125352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.125365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.259 #43 NEW cov: 11715 ft: 14917 corp: 42/121b lim: 5 exec/s: 43 rss: 70Mb L: 4/5 MS: 1 CopyPart- 00:07:25.259 [2024-07-15 00:15:24.164910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.164933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.259 [2024-07-15 00:15:24.164989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.259 [2024-07-15 00:15:24.165002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.259 #44 NEW cov: 11715 ft: 14929 corp: 43/123b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:25.259 #44 DONE cov: 11715 ft: 14929 corp: 43/123b lim: 5 exec/s: 22 rss: 70Mb 00:07:25.259 ###### Recommended dictionary. ###### 00:07:25.259 "\231[" # Uses: 3 00:07:25.259 "\377\377" # Uses: 0 00:07:25.259 ###### End of recommended dictionary. 
###### 00:07:25.259 Done 44 runs in 2 second(s) 00:07:25.259 00:15:24 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:25.259 00:15:24 -- ../common.sh@72 -- # (( i++ )) 00:07:25.259 00:15:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:25.259 00:15:24 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:25.259 00:15:24 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:25.259 00:15:24 -- nvmf/run.sh@24 -- # local timen=1 00:07:25.259 00:15:24 -- nvmf/run.sh@25 -- # local core=0x1 00:07:25.259 00:15:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:25.259 00:15:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:25.259 00:15:24 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:25.259 00:15:24 -- nvmf/run.sh@29 -- # port=4410 00:07:25.259 00:15:24 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:25.518 00:15:24 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:25.518 00:15:24 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:25.518 00:15:24 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:25.518 [2024-07-15 00:15:24.349322] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:25.518 [2024-07-15 00:15:24.349417] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid330593 ] 00:07:25.518 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.518 [2024-07-15 00:15:24.531424] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.777 [2024-07-15 00:15:24.594044] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.777 [2024-07-15 00:15:24.594192] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.777 [2024-07-15 00:15:24.652455] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:25.777 [2024-07-15 00:15:24.668754] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:25.777 INFO: Running with entropic power schedule (0xFF, 100). 00:07:25.777 INFO: Seed: 737510326 00:07:25.777 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:25.777 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:25.777 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:25.777 INFO: A corpus is not provided, starting from an empty corpus 00:07:25.777 #2 INITED exec/s: 0 rss: 60Mb 00:07:25.777 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:25.777 This may also happen if the target rejected all inputs we tried so far 00:07:25.777 [2024-07-15 00:15:24.734957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.777 [2024-07-15 00:15:24.734992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.777 [2024-07-15 00:15:24.735133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.777 [2024-07-15 00:15:24.735151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.035 NEW_FUNC[1/670]: 0x48daf0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:26.035 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:26.035 #5 NEW cov: 11511 ft: 11512 corp: 2/20b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:07:26.035 [2024-07-15 00:15:25.075901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.035 [2024-07-15 00:15:25.075947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.035 [2024-07-15 00:15:25.076115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.035 [2024-07-15 00:15:25.076138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.293 #6 NEW cov: 11624 ft: 12084 corp: 3/36b lim: 40 exec/s: 0 rss: 67Mb L: 16/19 MS: 1 InsertRepeatedBytes- 00:07:26.293 [2024-07-15 00:15:25.115827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.115855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.115980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.115997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.293 #12 NEW cov: 11630 ft: 12326 corp: 4/55b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:26.293 [2024-07-15 00:15:25.175820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.175846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.293 #13 NEW cov: 11715 ft: 13030 corp: 5/68b lim: 40 exec/s: 0 rss: 67Mb L: 13/19 MS: 1 CrossOver- 00:07:26.293 [2024-07-15 
00:15:25.226951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.226978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.227103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.227120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.227245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.227261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.227360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.227377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.227540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.227558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.293 #14 NEW cov: 11715 ft: 13753 corp: 6/108b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:26.293 [2024-07-15 00:15:25.287013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.287040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.287174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.287190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.287321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.287336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.293 [2024-07-15 00:15:25.287475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.293 [2024-07-15 00:15:25.287491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.294 [2024-07-15 00:15:25.287598] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.294 [2024-07-15 00:15:25.287613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.294 #15 NEW cov: 11715 ft: 13833 corp: 7/148b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBit- 00:07:26.294 [2024-07-15 00:15:25.346919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.294 [2024-07-15 00:15:25.346952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.294 [2024-07-15 00:15:25.347083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.294 [2024-07-15 00:15:25.347100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.294 [2024-07-15 00:15:25.347248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.294 [2024-07-15 00:15:25.347265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.552 #16 NEW cov: 11715 ft: 14068 corp: 8/178b lim: 40 exec/s: 0 rss: 67Mb L: 30/40 MS: 1 CopyPart- 00:07:26.552 [2024-07-15 00:15:25.386110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.386136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.552 #17 NEW cov: 11715 ft: 14142 corp: 9/187b lim: 40 exec/s: 0 rss: 68Mb L: 9/40 MS: 1 EraseBytes- 00:07:26.552 [2024-07-15 00:15:25.437362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.437388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.437525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.437547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.437675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.437691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.437817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.437834] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.552 #18 NEW cov: 11715 ft: 14200 corp: 10/220b lim: 40 exec/s: 0 rss: 68Mb L: 33/40 MS: 1 EraseBytes- 00:07:26.552 [2024-07-15 00:15:25.477013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.477039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.477168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fffffcff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.477185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.552 #19 NEW cov: 11715 ft: 14262 corp: 11/236b lim: 40 exec/s: 0 rss: 68Mb L: 16/40 MS: 1 ChangeBinInt- 00:07:26.552 [2024-07-15 00:15:25.516780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.516807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.552 #20 NEW cov: 11715 ft: 14289 corp: 12/245b lim: 40 exec/s: 0 rss: 68Mb L: 9/40 MS: 1 EraseBytes- 00:07:26.552 [2024-07-15 00:15:25.557765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.557791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.557931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.557947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.558063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.558080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.558210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.558226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.552 #21 NEW cov: 11715 ft: 14342 corp: 13/278b lim: 40 exec/s: 0 rss: 68Mb L: 33/40 MS: 1 CrossOver- 00:07:26.552 [2024-07-15 00:15:25.597413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.597439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:26.552 [2024-07-15 00:15:25.597578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.597595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.597738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.597755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.552 [2024-07-15 00:15:25.597882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.552 [2024-07-15 00:15:25.597899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.812 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:26.812 #22 NEW cov: 11738 ft: 14383 corp: 14/311b lim: 40 exec/s: 0 rss: 68Mb L: 33/40 MS: 1 CrossOver- 00:07:26.812 [2024-07-15 00:15:25.648120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.648147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.648281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:003c3c3c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.648297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.648428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c3c3c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.648446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.648585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:3c3c3c3c cdw11:3c3c3c3c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.648601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.648727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:3c3c0000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.648744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.812 #23 NEW cov: 11738 ft: 14397 corp: 15/351b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:26.812 [2024-07-15 00:15:25.698347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000400 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.698373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.698518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.698535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.698672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.698689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.698824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.698841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.698963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.698979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.812 #24 NEW cov: 11738 ft: 14410 corp: 16/391b lim: 40 exec/s: 24 rss: 68Mb L: 40/40 MS: 1 CMP- DE: "\000\004\000\000\000\000\000\000"- 00:07:26.812 [2024-07-15 00:15:25.747970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:fbffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.747996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.748133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.748150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.748270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.748286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.748416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.748435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.812 #25 NEW cov: 11738 ft: 14435 corp: 17/424b lim: 40 exec/s: 25 rss: 68Mb L: 33/40 MS: 1 ChangeBit- 00:07:26.812 [2024-07-15 00:15:25.798505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.798530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.798674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.798691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.798824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.798842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.798963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.798979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.799120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:1e000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.799143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.812 #26 NEW cov: 11738 ft: 14460 corp: 18/464b lim: 40 exec/s: 26 rss: 68Mb L: 40/40 MS: 1 CMP- DE: "\036\000"- 00:07:26.812 [2024-07-15 00:15:25.837594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.837619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.812 [2024-07-15 00:15:25.837756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.812 [2024-07-15 00:15:25.837772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.812 #27 NEW cov: 11738 ft: 14486 corp: 19/483b lim: 40 exec/s: 27 rss: 68Mb L: 19/40 MS: 1 CrossOver- 00:07:27.072 [2024-07-15 00:15:25.878387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:002a9bc3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.878415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:25.878552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:bb30799c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.878570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 
00:15:25.878694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.878709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.072 #28 NEW cov: 11738 ft: 14502 corp: 20/513b lim: 40 exec/s: 28 rss: 68Mb L: 30/40 MS: 1 CMP- DE: "\000*\233\303\2730y\234"- 00:07:27.072 [2024-07-15 00:15:25.918332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:002a9bc3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.918360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:25.918519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:bb30799c cdw11:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.918538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:25.918668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:9bc3bb30 cdw11:799c0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.918684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.072 #29 NEW cov: 11738 ft: 14526 corp: 21/543b lim: 40 exec/s: 29 rss: 68Mb L: 30/40 MS: 1 PersAutoDict- DE: "\000*\233\303\2730y\234"- 00:07:27.072 [2024-07-15 00:15:25.968464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff00fdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.968490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:25.968625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:25.968641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.072 #30 NEW cov: 11738 ft: 14638 corp: 22/559b lim: 40 exec/s: 30 rss: 69Mb L: 16/40 MS: 1 ChangeBinInt- 00:07:27.072 [2024-07-15 00:15:26.007870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.007897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.072 #31 NEW cov: 11738 ft: 14652 corp: 23/569b lim: 40 exec/s: 31 rss: 69Mb L: 10/40 MS: 1 InsertByte- 00:07:27.072 [2024-07-15 00:15:26.059204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.059232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.059364] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff2a0000 cdw11:00002a9b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.059383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.059530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c3bb3079 cdw11:9c000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.059548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.059686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.059702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.072 #32 NEW cov: 11738 ft: 14664 corp: 24/608b lim: 40 exec/s: 32 rss: 69Mb L: 39/40 MS: 1 CrossOver- 00:07:27.072 [2024-07-15 00:15:26.109561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.109591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.109726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.109744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.109888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.109904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.110043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.110060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.072 [2024-07-15 00:15:26.110164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.072 [2024-07-15 00:15:26.110181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.331 #33 NEW cov: 11738 ft: 14711 corp: 25/648b lim: 40 exec/s: 33 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:27.331 [2024-07-15 00:15:26.149402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a001e00 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.149430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:27.331 [2024-07-15 00:15:26.149571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.149589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.149727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.149746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.149888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.149904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.331 #34 NEW cov: 11738 ft: 14725 corp: 26/681b lim: 40 exec/s: 34 rss: 69Mb L: 33/40 MS: 1 PersAutoDict- DE: "\036\000"- 00:07:27.331 [2024-07-15 00:15:26.208954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:2a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.208983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.331 #35 NEW cov: 11738 ft: 14748 corp: 27/689b lim: 40 exec/s: 35 rss: 69Mb L: 8/40 MS: 1 CrossOver- 00:07:27.331 [2024-07-15 00:15:26.269851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.269877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.270019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.270042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.270180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.270196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.270335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.270351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.331 #36 NEW cov: 11738 ft: 14763 corp: 28/728b lim: 40 exec/s: 36 rss: 69Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:07:27.331 [2024-07-15 00:15:26.310230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.310258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.310388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:002a9bc3 cdw11:bb30799c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.310406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.310546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.310562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.310697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.310713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.310853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.310869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.331 #37 NEW cov: 11738 ft: 14799 corp: 29/768b lim: 40 exec/s: 37 rss: 69Mb L: 40/40 MS: 1 PersAutoDict- DE: "\000*\233\303\2730y\234"- 00:07:27.331 [2024-07-15 00:15:26.370318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.370345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.370486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.370503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.370634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.370650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.370778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.370797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.331 [2024-07-15 00:15:26.370933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:27.331 [2024-07-15 00:15:26.370950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.591 #38 NEW cov: 11738 ft: 14813 corp: 30/808b lim: 40 exec/s: 38 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:27.591 [2024-07-15 00:15:26.420290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:002a9bc3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.420317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.420455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:bb30799c cdw11:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.420472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.420622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:9b000000 cdw11:c3bb3079 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.420640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.420774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:9c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.420788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.591 #39 NEW cov: 11738 ft: 14828 corp: 31/841b lim: 40 exec/s: 39 rss: 69Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:07:27.591 [2024-07-15 00:15:26.480701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.480728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.480870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.480888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.481015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.481031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.481138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.481154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.481296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 
cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.481314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.591 #40 NEW cov: 11738 ft: 14829 corp: 32/881b lim: 40 exec/s: 40 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:27.591 [2024-07-15 00:15:26.530594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.530627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.530767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2a9bc3bb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.530783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.530927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30799c00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.530943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.531071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.531086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.591 #41 NEW cov: 11738 ft: 14848 corp: 33/919b lim: 40 exec/s: 41 rss: 69Mb L: 38/40 MS: 1 PersAutoDict- DE: "\000*\233\303\2730y\234"- 00:07:27.591 [2024-07-15 00:15:26.580240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.580267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.580405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.580421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.591 #42 NEW cov: 11738 ft: 14859 corp: 34/938b lim: 40 exec/s: 42 rss: 69Mb L: 19/40 MS: 1 ChangeBit- 00:07:27.591 [2024-07-15 00:15:26.630802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.630827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.630968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:002a9bc3 cdw11:bb30799c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.630984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.591 [2024-07-15 00:15:26.631123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.591 [2024-07-15 00:15:26.631138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.850 #43 NEW cov: 11738 ft: 14875 corp: 35/967b lim: 40 exec/s: 43 rss: 69Mb L: 29/40 MS: 1 EraseBytes- 00:07:27.850 [2024-07-15 00:15:26.691096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.850 [2024-07-15 00:15:26.691121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.851 [2024-07-15 00:15:26.691254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.691269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.851 [2024-07-15 00:15:26.691403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.691429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.851 [2024-07-15 00:15:26.691573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.691588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.851 #44 NEW cov: 11738 ft: 14882 corp: 36/1005b lim: 40 exec/s: 44 rss: 69Mb L: 38/40 MS: 1 EraseBytes- 00:07:27.851 [2024-07-15 00:15:26.741310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:ffedffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.741336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.851 [2024-07-15 00:15:26.741470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.741486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.851 [2024-07-15 00:15:26.741616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.741632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.851 [2024-07-15 00:15:26.741767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.851 [2024-07-15 00:15:26.741783] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.851 #45 NEW cov: 11738 ft: 14900 corp: 37/1038b lim: 40 exec/s: 22 rss: 69Mb L: 33/40 MS: 1 ChangeByte- 00:07:27.851 #45 DONE cov: 11738 ft: 14900 corp: 37/1038b lim: 40 exec/s: 22 rss: 69Mb 00:07:27.851 ###### Recommended dictionary. ###### 00:07:27.851 "\000\004\000\000\000\000\000\000" # Uses: 0 00:07:27.851 "\036\000" # Uses: 1 00:07:27.851 "\000*\233\303\2730y\234" # Uses: 3 00:07:27.851 ###### End of recommended dictionary. ###### 00:07:27.851 Done 45 runs in 2 second(s) 00:07:27.851 00:15:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:27.851 00:15:26 -- ../common.sh@72 -- # (( i++ )) 00:07:27.851 00:15:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:27.851 00:15:26 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:27.851 00:15:26 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:27.851 00:15:26 -- nvmf/run.sh@24 -- # local timen=1 00:07:27.851 00:15:26 -- nvmf/run.sh@25 -- # local core=0x1 00:07:27.851 00:15:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:27.851 00:15:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:27.851 00:15:26 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:27.851 00:15:26 -- nvmf/run.sh@29 -- # port=4411 00:07:27.851 00:15:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:27.851 00:15:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:27.851 00:15:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:27.851 00:15:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:28.110 [2024-07-15 00:15:26.923587] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:28.110 [2024-07-15 00:15:26.923667] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid331130 ] 00:07:28.110 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.110 [2024-07-15 00:15:27.099527] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.110 [2024-07-15 00:15:27.162017] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.110 [2024-07-15 00:15:27.162164] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.369 [2024-07-15 00:15:27.220289] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:28.369 [2024-07-15 00:15:27.236548] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:28.369 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:28.369 INFO: Seed: 3305495945 00:07:28.369 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:28.369 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:28.369 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:28.369 INFO: A corpus is not provided, starting from an empty corpus 00:07:28.369 #2 INITED exec/s: 0 rss: 60Mb 00:07:28.369 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:28.369 This may also happen if the target rejected all inputs we tried so far 00:07:28.369 [2024-07-15 00:15:27.291875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.369 [2024-07-15 00:15:27.291903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.628 NEW_FUNC[1/671]: 0x48f860 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:28.628 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:28.628 #10 NEW cov: 11523 ft: 11524 corp: 2/10b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 3 ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:07:28.628 [2024-07-15 00:15:27.622677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.628 [2024-07-15 00:15:27.622709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.628 #21 NEW cov: 11636 ft: 11844 corp: 3/19b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:28.628 [2024-07-15 00:15:27.662698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.628 [2024-07-15 00:15:27.662724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #27 NEW cov: 11642 ft: 12171 corp: 4/27b lim: 40 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 EraseBytes- 00:07:28.886 [2024-07-15 00:15:27.702783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000057 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.702810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #28 NEW cov: 11727 ft: 12477 corp: 5/35b lim: 40 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 EraseBytes- 00:07:28.886 [2024-07-15 00:15:27.742944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:41000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.742969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #34 NEW cov: 11727 ft: 12667 corp: 6/45b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertByte- 00:07:28.886 [2024-07-15 00:15:27.772994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:020000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.773020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #35 NEW cov: 11727 ft: 12750 corp: 7/53b lim: 40 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 ChangeByte- 00:07:28.886 [2024-07-15 00:15:27.813153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00004000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.813178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #39 NEW cov: 11727 ft: 12821 corp: 8/65b lim: 40 exec/s: 0 rss: 68Mb L: 12/12 MS: 4 EraseBytes-InsertByte-ChangeBit-CopyPart- 00:07:28.886 [2024-07-15 00:15:27.853299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.853324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #40 NEW cov: 11727 ft: 12865 corp: 9/73b lim: 40 exec/s: 0 rss: 68Mb L: 8/12 MS: 1 CrossOver- 00:07:28.886 [2024-07-15 00:15:27.893388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.893413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.886 #41 NEW cov: 11727 ft: 12952 corp: 10/81b lim: 40 exec/s: 0 rss: 68Mb L: 8/12 MS: 1 EraseBytes- 00:07:28.886 [2024-07-15 00:15:27.933499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00a400a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.886 [2024-07-15 00:15:27.933524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 #42 NEW cov: 11727 ft: 12994 corp: 11/89b lim: 40 exec/s: 0 rss: 68Mb L: 8/12 MS: 1 CopyPart- 00:07:29.146 [2024-07-15 00:15:27.963903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:27.963928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:27.963990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:27.964003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:27.964062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:27.964074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.146 #43 NEW cov: 11727 ft: 13748 corp: 12/118b lim: 40 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:29.146 [2024-07-15 00:15:28.004238] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.004263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.004323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.004337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.004398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.004411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.004472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.004485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.146 #44 NEW cov: 11727 ft: 14086 corp: 13/154b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:29.146 [2024-07-15 00:15:28.044040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.044065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.044139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00a40000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.044154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.146 #45 NEW cov: 11727 ft: 14289 corp: 14/170b lim: 40 exec/s: 0 rss: 68Mb L: 16/36 MS: 1 CrossOver- 00:07:29.146 [2024-07-15 00:15:28.083992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00410000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.084017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 #46 NEW cov: 11727 ft: 14310 corp: 15/179b lim: 40 exec/s: 0 rss: 68Mb L: 9/36 MS: 1 ChangeByte- 00:07:29.146 [2024-07-15 00:15:28.114050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.114075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 #47 NEW cov: 11727 ft: 14376 corp: 16/189b lim: 40 exec/s: 0 rss: 68Mb L: 10/36 MS: 1 CrossOver- 00:07:29.146 [2024-07-15 00:15:28.154664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000707 cdw11:07070707 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.154689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.154763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.154777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.154834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.154847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.146 [2024-07-15 00:15:28.154905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:07070707 cdw11:07000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.146 [2024-07-15 00:15:28.154919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.146 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:29.146 #48 NEW cov: 11750 ft: 14427 corp: 17/225b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:29.405 [2024-07-15 00:15:28.204314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b1000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.204340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 #54 NEW cov: 11750 ft: 14474 corp: 18/234b lim: 40 exec/s: 0 rss: 68Mb L: 9/36 MS: 1 InsertByte- 00:07:29.405 [2024-07-15 00:15:28.244898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.244922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.244980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.244994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.245049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7fffffdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.245063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.245117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.245131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.405 #55 NEW cov: 11750 ft: 
14502 corp: 19/270b lim: 40 exec/s: 55 rss: 69Mb L: 36/36 MS: 1 ChangeBit- 00:07:29.405 [2024-07-15 00:15:28.285055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.285079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.285138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.285151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.285207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.285220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.285275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.285286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.405 #56 NEW cov: 11750 ft: 14514 corp: 20/309b lim: 40 exec/s: 56 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:07:29.405 [2024-07-15 00:15:28.324702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:20000000 cdw11:020000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.324727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 #57 NEW cov: 11750 ft: 14535 corp: 21/317b lim: 40 exec/s: 57 rss: 69Mb L: 8/39 MS: 1 ChangeBit- 00:07:29.405 [2024-07-15 00:15:28.364855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:f80000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.364882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 [2024-07-15 00:15:28.394919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:20000000 cdw11:020000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.394943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 #63 NEW cov: 11750 ft: 14638 corp: 22/325b lim: 40 exec/s: 63 rss: 69Mb L: 8/39 MS: 1 ShuffleBytes- 00:07:29.405 [2024-07-15 00:15:28.435050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0200009f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.405 [2024-07-15 00:15:28.435075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.405 #64 NEW cov: 11750 ft: 14680 corp: 23/333b lim: 40 exec/s: 64 rss: 69Mb L: 8/39 MS: 1 ChangeBinInt- 00:07:29.665 [2024-07-15 00:15:28.475456] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.475481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.665 [2024-07-15 00:15:28.475542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:0000dfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.475556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.665 [2024-07-15 00:15:28.475615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.475629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.665 #65 NEW cov: 11750 ft: 14696 corp: 24/362b lim: 40 exec/s: 65 rss: 69Mb L: 29/39 MS: 1 CopyPart- 00:07:29.665 [2024-07-15 00:15:28.515286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00900000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.515311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.665 #66 NEW cov: 11750 ft: 14752 corp: 25/371b lim: 40 exec/s: 66 rss: 69Mb L: 9/39 MS: 1 ChangeByte- 00:07:29.665 [2024-07-15 00:15:28.545362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000f300 cdw11:000000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.545386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.665 #67 NEW cov: 11750 ft: 14759 corp: 26/379b lim: 40 exec/s: 67 rss: 69Mb L: 8/39 MS: 1 ChangeByte- 00:07:29.665 [2024-07-15 00:15:28.585804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.585828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.665 [2024-07-15 00:15:28.585904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.585918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.665 [2024-07-15 00:15:28.585980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.585994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.665 #68 NEW cov: 11750 ft: 14779 corp: 27/408b lim: 40 exec/s: 68 rss: 69Mb L: 29/39 MS: 1 CrossOver- 00:07:29.665 [2024-07-15 00:15:28.625567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:29.665 [2024-07-15 00:15:28.625591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.665 #69 NEW cov: 11750 ft: 14827 corp: 28/417b lim: 40 exec/s: 69 rss: 69Mb L: 9/39 MS: 1 ChangeBit- 00:07:29.665 [2024-07-15 00:15:28.655689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000dd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.655714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.665 #70 NEW cov: 11750 ft: 14846 corp: 29/427b lim: 40 exec/s: 70 rss: 69Mb L: 10/39 MS: 1 ChangeByte- 00:07:29.665 [2024-07-15 00:15:28.695759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000dd cdw11:f6ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.665 [2024-07-15 00:15:28.695783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.924 #71 NEW cov: 11750 ft: 14858 corp: 30/437b lim: 40 exec/s: 71 rss: 69Mb L: 10/39 MS: 1 ChangeBinInt- 00:07:29.925 [2024-07-15 00:15:28.735946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:00002000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.735971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 #72 NEW cov: 11750 ft: 14871 corp: 31/446b lim: 40 exec/s: 72 rss: 69Mb L: 9/39 MS: 1 ChangeBit- 00:07:29.925 [2024-07-15 00:15:28.776030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00004000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.776054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 #73 NEW cov: 11750 ft: 14889 corp: 32/458b lim: 40 exec/s: 73 rss: 69Mb L: 12/39 MS: 1 ChangeBit- 00:07:29.925 [2024-07-15 00:15:28.816607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.816632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.816691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.816705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.816763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.816776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.816834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 
[2024-07-15 00:15:28.816848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.856689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.856713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.856777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.856791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.856849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.856862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.856921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:020000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.856934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.925 #80 NEW cov: 11750 ft: 14908 corp: 33/497b lim: 40 exec/s: 80 rss: 70Mb L: 39/39 MS: 2 ChangeByte-CrossOver- 00:07:29.925 [2024-07-15 00:15:28.896768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:d4d4d4d4 cdw11:d4d4d4d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.896793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.896866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d4d4d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.896881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.896920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d4d4d4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.896933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.896993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d4d400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.897007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.925 #83 NEW cov: 11750 ft: 14925 corp: 34/532b lim: 40 exec/s: 83 rss: 70Mb L: 35/39 MS: 3 EraseBytes-CrossOver-InsertRepeatedBytes- 00:07:29.925 [2024-07-15 00:15:28.936427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:39000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.936454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 #89 NEW cov: 11750 ft: 14980 corp: 35/542b lim: 40 exec/s: 89 rss: 70Mb L: 10/39 MS: 1 ChangeBinInt- 00:07:29.925 [2024-07-15 00:15:28.977036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.977062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.977123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.977137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.977194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.977208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.925 [2024-07-15 00:15:28.977269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.925 [2024-07-15 00:15:28.977283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.183 [2024-07-15 00:15:29.017257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.017282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.017358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.017371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.017431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffdf cdw11:dfdf47df SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.017447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.017503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.017516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.017587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfa4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.017600] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.184 #91 NEW cov: 11750 ft: 15038 corp: 36/582b lim: 40 exec/s: 91 rss: 70Mb L: 40/40 MS: 2 CopyPart-CrossOver- 00:07:30.184 [2024-07-15 00:15:29.056766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000dd cdw11:01010101 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.056791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 #92 NEW cov: 11750 ft: 15045 corp: 37/597b lim: 40 exec/s: 92 rss: 70Mb L: 15/40 MS: 1 InsertRepeatedBytes- 00:07:30.184 [2024-07-15 00:15:29.087137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.087162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.087223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:0000dfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.087236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.087293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.087307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.184 #93 NEW cov: 11750 ft: 15086 corp: 38/626b lim: 40 exec/s: 93 rss: 70Mb L: 29/40 MS: 1 ChangeBit- 00:07:30.184 [2024-07-15 00:15:29.126954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:41000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.126980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 #94 NEW cov: 11750 ft: 15144 corp: 39/638b lim: 40 exec/s: 94 rss: 70Mb L: 12/40 MS: 1 CMP- DE: "\001\002"- 00:07:30.184 [2024-07-15 00:15:29.157043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002c0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.157068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 #95 NEW cov: 11750 ft: 15148 corp: 40/647b lim: 40 exec/s: 95 rss: 70Mb L: 9/40 MS: 1 ChangeByte- 00:07:30.184 [2024-07-15 00:15:29.187151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:41000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.187176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 #96 NEW cov: 11750 ft: 15155 corp: 41/659b lim: 40 exec/s: 96 rss: 70Mb L: 12/40 MS: 1 ShuffleBytes- 00:07:30.184 [2024-07-15 00:15:29.227806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.227830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.227889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.227903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.227974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.227988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.184 [2024-07-15 00:15:29.228043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.184 [2024-07-15 00:15:29.228056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.443 #97 NEW cov: 11750 ft: 15190 corp: 42/695b lim: 40 exec/s: 97 rss: 70Mb L: 36/40 MS: 1 ChangeByte- 00:07:30.443 [2024-07-15 00:15:29.267863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.443 [2024-07-15 00:15:29.267888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.443 [2024-07-15 00:15:29.267946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.443 [2024-07-15 00:15:29.267960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.443 [2024-07-15 00:15:29.268015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.443 [2024-07-15 00:15:29.268028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.443 [2024-07-15 00:15:29.268082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.443 [2024-07-15 00:15:29.268096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.443 #98 NEW cov: 11750 ft: 15198 corp: 43/731b lim: 40 exec/s: 49 rss: 70Mb L: 36/40 MS: 1 EraseBytes- 00:07:30.443 #98 DONE cov: 11750 ft: 15198 corp: 43/731b lim: 40 exec/s: 49 rss: 70Mb 00:07:30.443 ###### Recommended dictionary. ###### 00:07:30.443 "\001\002" # Uses: 0 00:07:30.443 ###### End of recommended dictionary. 
###### 00:07:30.443 Done 98 runs in 2 second(s) 00:07:30.443 00:15:29 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:30.443 00:15:29 -- ../common.sh@72 -- # (( i++ )) 00:07:30.443 00:15:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:30.443 00:15:29 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:30.443 00:15:29 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:30.443 00:15:29 -- nvmf/run.sh@24 -- # local timen=1 00:07:30.443 00:15:29 -- nvmf/run.sh@25 -- # local core=0x1 00:07:30.443 00:15:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:30.443 00:15:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:30.443 00:15:29 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:30.443 00:15:29 -- nvmf/run.sh@29 -- # port=4412 00:07:30.443 00:15:29 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:30.443 00:15:29 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:30.443 00:15:29 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:30.443 00:15:29 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:30.443 [2024-07-15 00:15:29.456913] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:30.443 [2024-07-15 00:15:29.456988] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid331559 ] 00:07:30.443 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.703 [2024-07-15 00:15:29.637962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.703 [2024-07-15 00:15:29.703540] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:30.703 [2024-07-15 00:15:29.703686] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.962 [2024-07-15 00:15:29.761507] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.962 [2024-07-15 00:15:29.777796] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:30.962 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.962 INFO: Seed: 1551518110 00:07:30.962 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:30.962 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:30.962 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:30.962 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.962 #2 INITED exec/s: 0 rss: 60Mb 00:07:30.962 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:30.962 This may also happen if the target rejected all inputs we tried so far 00:07:30.962 [2024-07-15 00:15:29.822515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.962 [2024-07-15 00:15:29.822549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.962 [2024-07-15 00:15:29.822599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.962 [2024-07-15 00:15:29.822615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.221 NEW_FUNC[1/671]: 0x4915d0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:31.221 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:31.221 #14 NEW cov: 11521 ft: 11522 corp: 2/20b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:31.221 [2024-07-15 00:15:30.153302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:0a898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.221 [2024-07-15 00:15:30.153352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.221 [2024-07-15 00:15:30.153402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.221 [2024-07-15 00:15:30.153418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.221 #15 NEW cov: 11634 ft: 11973 corp: 3/39b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 CrossOver- 00:07:31.221 [2024-07-15 00:15:30.223485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.221 [2024-07-15 00:15:30.223516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.221 [2024-07-15 00:15:30.223564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89088989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.221 [2024-07-15 00:15:30.223580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.221 [2024-07-15 00:15:30.223610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.221 [2024-07-15 00:15:30.223625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.222 [2024-07-15 00:15:30.223654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.222 [2024-07-15 00:15:30.223669] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.222 #16 NEW cov: 11640 ft: 12581 corp: 4/71b lim: 40 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CopyPart- 00:07:31.222 [2024-07-15 00:15:30.273540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:89012a9b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.222 [2024-07-15 00:15:30.273571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.222 [2024-07-15 00:15:30.273620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:c1219ed1 cdw11:1c898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.222 [2024-07-15 00:15:30.273637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.222 [2024-07-15 00:15:30.273667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.222 [2024-07-15 00:15:30.273683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.481 #22 NEW cov: 11725 ft: 13097 corp: 5/98b lim: 40 exec/s: 0 rss: 67Mb L: 27/32 MS: 1 CMP- DE: "\001*\233\301!\236\321\034"- 00:07:31.481 [2024-07-15 00:15:30.323713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08012a9b cdw11:c1219ed1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.481 [2024-07-15 00:15:30.323743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.481 [2024-07-15 00:15:30.323791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1c219ed1 cdw11:1c898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.481 [2024-07-15 00:15:30.323811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.481 [2024-07-15 00:15:30.323841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.481 [2024-07-15 00:15:30.323856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.481 #23 NEW cov: 11725 ft: 13219 corp: 6/125b lim: 40 exec/s: 0 rss: 67Mb L: 27/32 MS: 1 PersAutoDict- DE: "\001*\233\301!\236\321\034"- 00:07:31.481 [2024-07-15 00:15:30.383726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.481 [2024-07-15 00:15:30.383755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.481 #26 NEW cov: 11725 ft: 14036 corp: 7/136b lim: 40 exec/s: 0 rss: 67Mb L: 11/32 MS: 3 CopyPart-CMP-CMP- DE: "\001\001"-"\003\000\000\000\000\000\000\000"- 00:07:31.481 [2024-07-15 00:15:30.443919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.481 [2024-07-15 00:15:30.443949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.481 #27 NEW cov: 11725 ft: 14186 corp: 8/150b lim: 40 exec/s: 0 rss: 67Mb L: 14/32 MS: 1 CrossOver- 00:07:31.481 [2024-07-15 00:15:30.514017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.481 [2024-07-15 00:15:30.514046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.740 #28 NEW cov: 11725 ft: 14266 corp: 9/164b lim: 40 exec/s: 0 rss: 68Mb L: 14/32 MS: 1 CrossOver- 00:07:31.740 [2024-07-15 00:15:30.584327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:0a898929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.584357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.584406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.584423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.741 #29 NEW cov: 11725 ft: 14302 corp: 10/184b lim: 40 exec/s: 0 rss: 68Mb L: 20/32 MS: 1 InsertByte- 00:07:31.741 [2024-07-15 00:15:30.644560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:0a898929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.644590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.644637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.644654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.644683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.644699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.644727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000089 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.644746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.741 #30 NEW cov: 11725 ft: 14347 corp: 11/222b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:31.741 [2024-07-15 00:15:30.704771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:89890a89 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.704801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.704849] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89298900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.704865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.704895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.704911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.704940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.704955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.704985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.705000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:31.741 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:31.741 #31 NEW cov: 11742 ft: 14448 corp: 12/262b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:07:31.741 [2024-07-15 00:15:30.774987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.775019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.775052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89088989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.775067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.775097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.775112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.741 [2024-07-15 00:15:30.775141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:89898989 cdw11:30898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.741 [2024-07-15 00:15:30.775155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.000 #32 NEW cov: 11742 ft: 14467 corp: 13/294b lim: 40 exec/s: 32 rss: 68Mb L: 32/40 MS: 1 ChangeByte- 00:07:32.000 [2024-07-15 00:15:30.834948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.000 [2024-07-15 00:15:30.834978] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.000 #33 NEW cov: 11742 ft: 14541 corp: 14/308b lim: 40 exec/s: 33 rss: 68Mb L: 14/40 MS: 1 ShuffleBytes- 00:07:32.000 [2024-07-15 00:15:30.885096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.000 [2024-07-15 00:15:30.885127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.000 [2024-07-15 00:15:30.885160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89888989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.000 [2024-07-15 00:15:30.885175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.000 #34 NEW cov: 11742 ft: 14621 corp: 15/327b lim: 40 exec/s: 34 rss: 68Mb L: 19/40 MS: 1 ChangeBit- 00:07:32.000 [2024-07-15 00:15:30.935156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89890189 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.000 [2024-07-15 00:15:30.935187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.000 #35 NEW cov: 11742 ft: 14632 corp: 16/341b lim: 40 exec/s: 35 rss: 68Mb L: 14/40 MS: 1 ShuffleBytes- 00:07:32.000 [2024-07-15 00:15:31.005414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.000 [2024-07-15 00:15:31.005450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.000 [2024-07-15 00:15:31.005499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00010103 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.000 [2024-07-15 00:15:31.005515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.000 #36 NEW cov: 11742 ft: 14647 corp: 17/360b lim: 40 exec/s: 36 rss: 68Mb L: 19/40 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:07:32.259 [2024-07-15 00:15:31.065593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.065624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.065671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00010108 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.065687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.259 #37 NEW cov: 11742 ft: 14686 corp: 18/379b lim: 40 exec/s: 37 rss: 68Mb L: 19/40 MS: 1 ChangeBinInt- 00:07:32.259 [2024-07-15 00:15:31.125784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:089f2a9b cdw11:c1219ed1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 
00:15:31.125814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.125861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1c219ed1 cdw11:1c898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.125876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.125905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.125921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.259 #38 NEW cov: 11742 ft: 14716 corp: 19/406b lim: 40 exec/s: 38 rss: 69Mb L: 27/40 MS: 1 ChangeByte- 00:07:32.259 [2024-07-15 00:15:31.185993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08012a9b cdw11:c1219ed1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.186025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.186058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:23219ed1 cdw11:1c898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.186074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.186102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.186118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.259 #39 NEW cov: 11742 ft: 14731 corp: 20/433b lim: 40 exec/s: 39 rss: 69Mb L: 27/40 MS: 1 ChangeBinInt- 00:07:32.259 [2024-07-15 00:15:31.236069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08012a9b cdw11:c1219ed1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.236098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.236146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1c219ed1 cdw11:1c898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.236161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.236190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89890000 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.236206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.259 #40 NEW cov: 11742 ft: 14740 corp: 21/462b lim: 40 exec/s: 40 rss: 69Mb L: 29/40 MS: 1 CMP- DE: "\000\000"- 00:07:32.259 [2024-07-15 00:15:31.286185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:89012a9b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.286215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.259 [2024-07-15 00:15:31.286264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:c1219ed1 cdw11:1c898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.259 [2024-07-15 00:15:31.286290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.518 #41 NEW cov: 11742 ft: 14754 corp: 22/483b lim: 40 exec/s: 41 rss: 69Mb L: 21/40 MS: 1 EraseBytes- 00:07:32.518 [2024-07-15 00:15:31.336275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.336304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.518 [2024-07-15 00:15:31.336351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00010108 cdw11:00003d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.336367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.518 #42 NEW cov: 11742 ft: 14793 corp: 23/503b lim: 40 exec/s: 42 rss: 69Mb L: 20/40 MS: 1 InsertByte- 00:07:32.518 [2024-07-15 00:15:31.396523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.396556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.518 [2024-07-15 00:15:31.396589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00008989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.396605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.518 [2024-07-15 00:15:31.396634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.396649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.518 #43 NEW cov: 11742 ft: 14845 corp: 24/528b lim: 40 exec/s: 43 rss: 69Mb L: 25/40 MS: 1 EraseBytes- 00:07:32.518 [2024-07-15 00:15:31.456624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.456653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.518 [2024-07-15 00:15:31.456687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.456702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.518 #48 NEW cov: 11742 ft: 14859 corp: 25/549b lim: 40 exec/s: 48 rss: 69Mb L: 21/40 MS: 5 InsertByte-ShuffleBytes-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:07:32.518 [2024-07-15 00:15:31.506680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.518 [2024-07-15 00:15:31.506708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.518 #49 NEW cov: 11742 ft: 14864 corp: 26/563b lim: 40 exec/s: 49 rss: 69Mb L: 14/40 MS: 1 ChangeBinInt- 00:07:32.777 [2024-07-15 00:15:31.576957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.576989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.777 [2024-07-15 00:15:31.577037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:890b0088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.577054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.777 #50 NEW cov: 11742 ft: 14889 corp: 27/584b lim: 40 exec/s: 50 rss: 69Mb L: 21/40 MS: 1 CMP- DE: "\013\000"- 00:07:32.777 [2024-07-15 00:15:31.637181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a03b6b6 cdw11:b6b6b6b6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.637211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.777 [2024-07-15 00:15:31.637245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.637261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.777 [2024-07-15 00:15:31.637290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:b6000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.637306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.777 #51 NEW cov: 11742 ft: 14947 corp: 28/610b lim: 40 exec/s: 51 rss: 69Mb L: 26/40 MS: 1 InsertRepeatedBytes- 00:07:32.777 [2024-07-15 00:15:31.687167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.687197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.777 #52 NEW cov: 11749 ft: 14964 corp: 29/625b lim: 40 exec/s: 52 rss: 69Mb L: 15/40 MS: 1 CMP- DE: "\377\377\377\002"- 00:07:32.777 [2024-07-15 00:15:31.747493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.747523] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.777 [2024-07-15 00:15:31.747570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89a30889 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.747586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.777 [2024-07-15 00:15:31.747615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.747631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.777 [2024-07-15 00:15:31.747659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.747674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.777 #53 NEW cov: 11749 ft: 14969 corp: 30/658b lim: 40 exec/s: 53 rss: 69Mb L: 33/40 MS: 1 InsertByte- 00:07:32.777 [2024-07-15 00:15:31.797476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:f4fcffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.777 [2024-07-15 00:15:31.797505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.036 #54 NEW cov: 11749 ft: 14990 corp: 31/673b lim: 40 exec/s: 27 rss: 69Mb L: 15/40 MS: 1 ChangeBinInt- 00:07:33.036 #54 DONE cov: 11749 ft: 14990 corp: 31/673b lim: 40 exec/s: 27 rss: 69Mb 00:07:33.037 ###### Recommended dictionary. ###### 00:07:33.037 "\001*\233\301!\236\321\034" # Uses: 1 00:07:33.037 "\001\001" # Uses: 0 00:07:33.037 "\003\000\000\000\000\000\000\000" # Uses: 1 00:07:33.037 "\000\000" # Uses: 0 00:07:33.037 "\013\000" # Uses: 0 00:07:33.037 "\377\377\377\002" # Uses: 0 00:07:33.037 ###### End of recommended dictionary. 
###### 00:07:33.037 Done 54 runs in 2 second(s) 00:07:33.037 00:15:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:33.037 00:15:31 -- ../common.sh@72 -- # (( i++ )) 00:07:33.037 00:15:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:33.037 00:15:31 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:33.037 00:15:31 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:33.037 00:15:31 -- nvmf/run.sh@24 -- # local timen=1 00:07:33.037 00:15:31 -- nvmf/run.sh@25 -- # local core=0x1 00:07:33.037 00:15:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:33.037 00:15:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:33.037 00:15:31 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:33.037 00:15:31 -- nvmf/run.sh@29 -- # port=4413 00:07:33.037 00:15:31 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:33.037 00:15:31 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:33.037 00:15:31 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:33.037 00:15:31 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:33.037 [2024-07-15 00:15:32.010328] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:33.037 [2024-07-15 00:15:32.010397] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid331966 ] 00:07:33.037 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.296 [2024-07-15 00:15:32.187302] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.296 [2024-07-15 00:15:32.249590] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.296 [2024-07-15 00:15:32.249739] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.296 [2024-07-15 00:15:32.307660] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:33.296 [2024-07-15 00:15:32.323972] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:33.296 INFO: Running with entropic power schedule (0xFF, 100). 00:07:33.296 INFO: Seed: 4095520014 00:07:33.555 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:33.555 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:33.555 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:33.555 INFO: A corpus is not provided, starting from an empty corpus 00:07:33.555 #2 INITED exec/s: 0 rss: 60Mb 00:07:33.555 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:33.555 This may also happen if the target rejected all inputs we tried so far 00:07:33.555 [2024-07-15 00:15:32.393605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.555 [2024-07-15 00:15:32.393641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.814 NEW_FUNC[1/670]: 0x493190 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:33.814 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.814 #9 NEW cov: 11509 ft: 11510 corp: 2/16b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:33.814 [2024-07-15 00:15:32.734377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.814 [2024-07-15 00:15:32.734420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.814 #10 NEW cov: 11622 ft: 12069 corp: 3/31b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 CopyPart- 00:07:33.814 [2024-07-15 00:15:32.784472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.814 [2024-07-15 00:15:32.784497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.814 #11 NEW cov: 11628 ft: 12379 corp: 4/40b lim: 40 exec/s: 0 rss: 67Mb L: 9/15 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\010"- 00:07:33.814 [2024-07-15 00:15:32.824797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.814 [2024-07-15 00:15:32.824822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.814 [2024-07-15 00:15:32.824939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.814 [2024-07-15 00:15:32.824957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.814 #12 NEW cov: 11713 ft: 13055 corp: 5/56b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 InsertByte- 00:07:33.814 [2024-07-15 00:15:32.864931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.814 [2024-07-15 00:15:32.864956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.814 [2024-07-15 00:15:32.865073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.814 [2024-07-15 00:15:32.865089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
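A note on reading the harness output around this point: each "#N NEW cov: ... MS: ..." record is a standard libFuzzer status line. Roughly, "cov" counts covered code edges, "ft" counts observed features, "corp" gives the corpus size in units and bytes, "lim" is the current input-length cap, "exec/s" and "rss" are execution rate and resident memory, "L" is the new input's length against the largest unit in the corpus, and "MS"/"DE" name the mutation sequence and any dictionary entries that produced it. The paired nvme_qpair.c NOTICE lines are SPDK echoing each mutated admin command (here DIRECTIVE RECEIVE, opcode 0x1a) and the INVALID OPCODE completion the target returned. As a minimal sketch, the run in progress (target 13) could be re-driven by hand with the paths and flags copied verbatim from the nvmf/run.sh trace earlier in this log; the SPDK shell variable is introduced here only for brevity, and /tmp/fuzz_json_13.conf is assumed to already exist from the sed step shown in that trace.

  # Sketch: manually re-running LLVM NVMe fuzz target 13 against its saved
  # corpus. Every flag below is copied from the nvmf/run.sh trace above.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m 0x1 -s 512 \
    -P "$SPDK/../output/llvm/" \
    -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' \
    -c /tmp/fuzz_json_13.conf \
    -t 1 \
    -D "$SPDK/../corpus/llvm_nvmf_13" \
    -Z 13 -r /var/tmp/spdk13.sock

The log resumes below.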
00:07:34.073 #15 NEW cov: 11713 ft: 13119 corp: 6/77b lim: 40 exec/s: 0 rss: 67Mb L: 21/21 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:07:34.073 [2024-07-15 00:15:32.904806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:32.904833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.073 #17 NEW cov: 11713 ft: 13176 corp: 7/86b lim: 40 exec/s: 0 rss: 67Mb L: 9/21 MS: 2 ShuffleBytes-PersAutoDict- DE: "\377\377\377\377\377\377\377\010"- 00:07:34.073 [2024-07-15 00:15:32.944910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dede9ede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:32.944937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.073 [2024-07-15 00:15:32.945054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:32.945070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.073 #18 NEW cov: 11713 ft: 13282 corp: 8/102b lim: 40 exec/s: 0 rss: 67Mb L: 16/21 MS: 1 ChangeBit- 00:07:34.073 [2024-07-15 00:15:32.995393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:32.995419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.073 [2024-07-15 00:15:32.995573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:32.995590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.073 #19 NEW cov: 11713 ft: 13344 corp: 9/118b lim: 40 exec/s: 0 rss: 67Mb L: 16/21 MS: 1 ShuffleBytes- 00:07:34.073 [2024-07-15 00:15:33.035509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:33.035535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.073 [2024-07-15 00:15:33.035665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:33.035682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.073 #20 NEW cov: 11713 ft: 13394 corp: 10/134b lim: 40 exec/s: 0 rss: 67Mb L: 16/21 MS: 1 ShuffleBytes- 00:07:34.073 [2024-07-15 00:15:33.075402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:33.075430] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.073 #21 NEW cov: 11713 ft: 13486 corp: 11/146b lim: 40 exec/s: 0 rss: 67Mb L: 12/21 MS: 1 EraseBytes- 00:07:34.073 [2024-07-15 00:15:33.115658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:33.115684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.073 [2024-07-15 00:15:33.115826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede1440 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.073 [2024-07-15 00:15:33.115843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.333 #22 NEW cov: 11713 ft: 13545 corp: 12/162b lim: 40 exec/s: 0 rss: 68Mb L: 16/21 MS: 1 ChangeBinInt- 00:07:34.333 [2024-07-15 00:15:33.156001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.156027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.333 [2024-07-15 00:15:33.156159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.156174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.333 [2024-07-15 00:15:33.156297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.156313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.333 #23 NEW cov: 11713 ft: 13779 corp: 13/186b lim: 40 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 CrossOver- 00:07:34.333 [2024-07-15 00:15:33.195696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff08 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.195721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.333 #25 NEW cov: 11713 ft: 13806 corp: 14/195b lim: 40 exec/s: 0 rss: 68Mb L: 9/24 MS: 2 CopyPart-PersAutoDict- DE: "\377\377\377\377\377\377\377\010"- 00:07:34.333 [2024-07-15 00:15:33.236222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00dede23 cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.236249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.333 [2024-07-15 00:15:33.236384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dede0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.236402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.333 [2024-07-15 00:15:33.236524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.236540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.333 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:34.333 #26 NEW cov: 11736 ft: 13841 corp: 15/225b lim: 40 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CrossOver- 00:07:34.333 [2024-07-15 00:15:33.285935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:deffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.285961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.333 #27 NEW cov: 11736 ft: 13852 corp: 16/237b lim: 40 exec/s: 0 rss: 68Mb L: 12/30 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\010"- 00:07:34.333 [2024-07-15 00:15:33.326096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0afffbff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.326123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.333 #28 NEW cov: 11736 ft: 13896 corp: 17/246b lim: 40 exec/s: 0 rss: 68Mb L: 9/30 MS: 1 ChangeBinInt- 00:07:34.333 [2024-07-15 00:15:33.366687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:deffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.366713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.333 [2024-07-15 00:15:33.366837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.366852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.333 [2024-07-15 00:15:33.366989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:080a4040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.333 [2024-07-15 00:15:33.367004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.592 #29 NEW cov: 11736 ft: 13904 corp: 18/270b lim: 40 exec/s: 29 rss: 68Mb L: 24/30 MS: 1 InsertRepeatedBytes- 00:07:34.592 [2024-07-15 00:15:33.406407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.406432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.592 #30 NEW cov: 11736 ft: 13957 corp: 19/279b lim: 40 exec/s: 30 rss: 68Mb L: 9/30 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\010"- 00:07:34.592 [2024-07-15 00:15:33.446768] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:32dedede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.446792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.446908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.446925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.592 #31 NEW cov: 11736 ft: 13963 corp: 20/295b lim: 40 exec/s: 31 rss: 69Mb L: 16/30 MS: 1 InsertByte- 00:07:34.592 [2024-07-15 00:15:33.487225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.487249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.487383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.487398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.487525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:e5e50000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.487541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.487669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.487685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.592 #32 NEW cov: 11736 ft: 14421 corp: 21/334b lim: 40 exec/s: 32 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:34.592 [2024-07-15 00:15:33.526821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0afffbff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.526846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.592 #33 NEW cov: 11736 ft: 14450 corp: 22/343b lim: 40 exec/s: 33 rss: 69Mb L: 9/39 MS: 1 ShuffleBytes- 00:07:34.592 [2024-07-15 00:15:33.567317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:deffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.567343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.567477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 
00:15:33.567494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.567619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:02ffffff cdw11:080a4040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.567634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.592 #34 NEW cov: 11736 ft: 14468 corp: 23/367b lim: 40 exec/s: 34 rss: 69Mb L: 24/39 MS: 1 ChangeBinInt- 00:07:34.592 [2024-07-15 00:15:33.607576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.607601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.607725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.607740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.592 [2024-07-15 00:15:33.607870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:e5e50000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.592 [2024-07-15 00:15:33.607887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.593 [2024-07-15 00:15:33.608013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.593 [2024-07-15 00:15:33.608029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.593 #35 NEW cov: 11736 ft: 14486 corp: 24/406b lim: 40 exec/s: 35 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:07:34.851 [2024-07-15 00:15:33.657097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dedede0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.657124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.851 #36 NEW cov: 11736 ft: 14517 corp: 25/415b lim: 40 exec/s: 36 rss: 69Mb L: 9/39 MS: 1 EraseBytes- 00:07:34.851 [2024-07-15 00:15:33.697501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.697527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.851 [2024-07-15 00:15:33.697642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dedef740 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.697659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.851 #37 NEW cov: 11736 ft: 14526 corp: 26/431b lim: 40 
exec/s: 37 rss: 69Mb L: 16/39 MS: 1 ChangeBinInt- 00:07:34.851 [2024-07-15 00:15:33.737349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.737373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.851 #38 NEW cov: 11736 ft: 14563 corp: 27/443b lim: 40 exec/s: 38 rss: 69Mb L: 12/39 MS: 1 CopyPart- 00:07:34.851 [2024-07-15 00:15:33.777940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:de32dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.777966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.851 [2024-07-15 00:15:33.778098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.778116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.851 [2024-07-15 00:15:33.778244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:32dedede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.778261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.851 #39 NEW cov: 11736 ft: 14599 corp: 28/471b lim: 40 exec/s: 39 rss: 69Mb L: 28/39 MS: 1 CopyPart- 00:07:34.851 [2024-07-15 00:15:33.827897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dede23de cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.827924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.851 [2024-07-15 00:15:33.828056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:de32dede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.828073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.851 #40 NEW cov: 11736 ft: 14635 corp: 29/487b lim: 40 exec/s: 40 rss: 69Mb L: 16/39 MS: 1 ChangeByte- 00:07:34.851 [2024-07-15 00:15:33.868023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dedc23de cdw11:dede9ede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.868050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.851 [2024-07-15 00:15:33.868175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.851 [2024-07-15 00:15:33.868191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.851 #41 NEW cov: 11736 ft: 14658 corp: 30/503b lim: 40 exec/s: 41 rss: 69Mb L: 16/39 MS: 1 ChangeBit- 00:07:35.110 [2024-07-15 00:15:33.918245] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b0adede cdw11:23dedede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.110 [2024-07-15 00:15:33.918273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.110 [2024-07-15 00:15:33.918387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.110 [2024-07-15 00:15:33.918404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.110 #43 NEW cov: 11736 ft: 14668 corp: 31/521b lim: 40 exec/s: 43 rss: 69Mb L: 18/39 MS: 2 InsertByte-CrossOver- 00:07:35.110 [2024-07-15 00:15:33.958337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:32dedede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.110 [2024-07-15 00:15:33.958362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.110 [2024-07-15 00:15:33.958487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dedede9e cdw11:dede0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.110 [2024-07-15 00:15:33.958505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.110 #44 NEW cov: 11736 ft: 14734 corp: 32/537b lim: 40 exec/s: 44 rss: 69Mb L: 16/39 MS: 1 ChangeBit- 00:07:35.110 [2024-07-15 00:15:33.998406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.110 [2024-07-15 00:15:33.998431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.110 [2024-07-15 00:15:33.998555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.110 [2024-07-15 00:15:33.998570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.111 #45 NEW cov: 11736 ft: 14748 corp: 33/559b lim: 40 exec/s: 45 rss: 69Mb L: 22/39 MS: 1 EraseBytes- 00:07:35.111 [2024-07-15 00:15:34.038350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.038377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.111 #46 NEW cov: 11736 ft: 14786 corp: 34/568b lim: 40 exec/s: 46 rss: 70Mb L: 9/39 MS: 1 ChangeBit- 00:07:35.111 [2024-07-15 00:15:34.078428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:dedcdede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.078462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.111 #47 NEW cov: 11736 ft: 14843 corp: 35/580b lim: 40 exec/s: 47 rss: 70Mb L: 12/39 MS: 1 ChangeBinInt- 
00:07:35.111 [2024-07-15 00:15:34.118768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:32ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.118796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.111 [2024-07-15 00:15:34.118914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:08de0a40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.118932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.111 #48 NEW cov: 11736 ft: 14885 corp: 36/596b lim: 40 exec/s: 48 rss: 70Mb L: 16/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\010"- 00:07:35.111 [2024-07-15 00:15:34.159195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:deffdfff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.159221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.111 [2024-07-15 00:15:34.159358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.159376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.111 [2024-07-15 00:15:34.159514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:02ffffff cdw11:080a4040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.111 [2024-07-15 00:15:34.159531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.430 #49 NEW cov: 11736 ft: 14897 corp: 37/620b lim: 40 exec/s: 49 rss: 70Mb L: 24/39 MS: 1 ChangeBit- 00:07:35.430 [2024-07-15 00:15:34.208994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.430 [2024-07-15 00:15:34.209020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.430 [2024-07-15 00:15:34.209145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.430 [2024-07-15 00:15:34.209161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.430 [2024-07-15 00:15:34.209281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:ffffff08 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.430 [2024-07-15 00:15:34.209297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.431 #51 NEW cov: 11736 ft: 14907 corp: 38/644b lim: 40 exec/s: 51 rss: 70Mb L: 24/39 MS: 2 EraseBytes-CrossOver- 00:07:35.431 [2024-07-15 00:15:34.249439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.249469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.249595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.249614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.249747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.249764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.431 #52 NEW cov: 11736 ft: 14943 corp: 39/670b lim: 40 exec/s: 52 rss: 70Mb L: 26/39 MS: 1 InsertRepeatedBytes- 00:07:35.431 [2024-07-15 00:15:34.289530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:dededede cdw11:de32dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.289560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.289696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.289713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.289847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:32defbfb cdw11:fbdedede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.289864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.431 #53 NEW cov: 11736 ft: 14952 corp: 40/701b lim: 40 exec/s: 53 rss: 70Mb L: 31/39 MS: 1 InsertRepeatedBytes- 00:07:35.431 [2024-07-15 00:15:34.339932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.339958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.340092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.340108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.340241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:e5e50000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.340269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.431 [2024-07-15 00:15:34.340400] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.340419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.431 #54 NEW cov: 11736 ft: 14968 corp: 41/740b lim: 40 exec/s: 54 rss: 70Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:35.431 [2024-07-15 00:15:34.379373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3adedede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.431 [2024-07-15 00:15:34.379408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.431 #55 NEW cov: 11736 ft: 14969 corp: 42/752b lim: 40 exec/s: 27 rss: 70Mb L: 12/39 MS: 1 ChangeByte- 00:07:35.431 #55 DONE cov: 11736 ft: 14969 corp: 42/752b lim: 40 exec/s: 27 rss: 70Mb 00:07:35.431 ###### Recommended dictionary. ###### 00:07:35.431 "\377\377\377\377\377\377\377\010" # Uses: 5 00:07:35.431 ###### End of recommended dictionary. ###### 00:07:35.431 Done 55 runs in 2 second(s) 00:07:35.690 00:15:34 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:35.690 00:15:34 -- ../common.sh@72 -- # (( i++ )) 00:07:35.690 00:15:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.690 00:15:34 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:35.690 00:15:34 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:35.690 00:15:34 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.690 00:15:34 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.690 00:15:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:35.690 00:15:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:35.690 00:15:34 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:35.690 00:15:34 -- nvmf/run.sh@29 -- # port=4414 00:07:35.690 00:15:34 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:35.690 00:15:34 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:35.690 00:15:34 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.690 00:15:34 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:35.690 [2024-07-15 00:15:34.568060] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:35.690 [2024-07-15 00:15:34.568126] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid332511 ] 00:07:35.690 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.690 [2024-07-15 00:15:34.746127] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.949 [2024-07-15 00:15:34.807401] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.949 [2024-07-15 00:15:34.807551] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.949 [2024-07-15 00:15:34.865273] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.949 [2024-07-15 00:15:34.881546] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:35.949 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.949 INFO: Seed: 2359580199 00:07:35.949 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:35.949 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:35.949 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:35.949 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.949 #2 INITED exec/s: 0 rss: 60Mb 00:07:35.949 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:35.949 This may also happen if the target rejected all inputs we tried so far 00:07:35.949 [2024-07-15 00:15:34.926426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.949 [2024-07-15 00:15:34.926467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.949 [2024-07-15 00:15:34.926517] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.949 [2024-07-15 00:15:34.926533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.949 [2024-07-15 00:15:34.926564] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.949 [2024-07-15 00:15:34.926580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.208 NEW_FUNC[1/673]: 0x494d50 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:36.208 NEW_FUNC[2/673]: 0x4b60f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:36.208 #11 NEW cov: 11531 ft: 11531 corp: 2/34b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 4 CrossOver-CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:07:36.208 [2024-07-15 00:15:35.257215] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.208 [2024-07-15 00:15:35.257256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.208 [2024-07-15 00:15:35.257293] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.208 [2024-07-15 00:15:35.257310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.208 [2024-07-15 00:15:35.257342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.208 [2024-07-15 00:15:35.257365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.208 [2024-07-15 00:15:35.257396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.208 [2024-07-15 00:15:35.257413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.467 #14 NEW cov: 11656 ft: 12127 corp: 3/62b lim: 35 exec/s: 0 rss: 67Mb L: 28/33 MS: 3 CMP-ChangeByte-InsertRepeatedBytes- DE: "\000\037"- 00:07:36.467 [2024-07-15 00:15:35.307258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.307290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.307325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.307342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.307373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.307390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.307420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.307436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.467 #15 NEW cov: 11662 ft: 12264 corp: 4/90b lim: 35 exec/s: 0 rss: 67Mb L: 28/33 MS: 1 CopyPart- 00:07:36.467 [2024-07-15 00:15:35.367306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.367336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.367384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.367401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.367430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.367453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.367484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.367500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.467 #16 NEW cov: 11747 ft: 12553 corp: 5/118b lim: 35 exec/s: 0 rss: 67Mb L: 28/33 MS: 1 ChangeByte- 00:07:36.467 [2024-07-15 00:15:35.437544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.437575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.437609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.437625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.437660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.437675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.437705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.437721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.467 #17 NEW cov: 11747 ft: 12747 corp: 6/150b lim: 35 exec/s: 0 rss: 67Mb L: 32/33 MS: 1 InsertRepeatedBytes- 00:07:36.467 [2024-07-15 00:15:35.487715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.487745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.487795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.487812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.487842] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.487858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.467 [2024-07-15 00:15:35.487888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.467 [2024-07-15 00:15:35.487905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:36.726 #18 NEW cov: 11747 ft: 12800 corp: 7/184b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 PersAutoDict- DE: "\000\037"- 00:07:36.726 #23 NEW cov: 11747 ft: 13574 corp: 8/192b lim: 35 exec/s: 0 rss: 67Mb L: 8/34 MS: 5 CopyPart-PersAutoDict-EraseBytes-ShuffleBytes-CrossOver- DE: "\000\037"- 00:07:36.726 [2024-07-15 00:15:35.608017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.608048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.608099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.608116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.608149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.608165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.608196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.608211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.726 #24 NEW cov: 11747 ft: 13667 corp: 9/220b lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 ChangeBinInt- 00:07:36.726 [2024-07-15 00:15:35.678201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.678231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.678272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.678289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.678318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.678334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.678363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.678379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.726 #25 NEW cov: 11747 ft: 13722 corp: 10/248b lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 ChangeBit- 00:07:36.726 [2024-07-15 00:15:35.728280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:36.726 [2024-07-15 00:15:35.728311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.728359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.728375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.728404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.728420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.726 [2024-07-15 00:15:35.728457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.726 [2024-07-15 00:15:35.728473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.726 #26 NEW cov: 11747 ft: 13773 corp: 11/281b lim: 35 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:07:36.985 [2024-07-15 00:15:35.788487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.788518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.788568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.788585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.788615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.788641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.788670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.788686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.985 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.985 #27 NEW cov: 11764 ft: 13833 corp: 12/310b lim: 35 exec/s: 0 rss: 68Mb L: 29/34 MS: 1 InsertByte- 00:07:36.985 [2024-07-15 00:15:35.848642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.848672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.848706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 
[2024-07-15 00:15:35.848722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.848752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.848768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.848797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.848812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.985 #28 NEW cov: 11764 ft: 13915 corp: 13/338b lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 ChangeByte- 00:07:36.985 [2024-07-15 00:15:35.898769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.898799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.898832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.898849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.898878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.898892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.898922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.898938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.985 #29 NEW cov: 11764 ft: 13926 corp: 14/370b lim: 35 exec/s: 29 rss: 68Mb L: 32/34 MS: 1 ShuffleBytes- 00:07:36.985 [2024-07-15 00:15:35.948871] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.985 [2024-07-15 00:15:35.948901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.985 [2024-07-15 00:15:35.948934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:35.948949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.986 [2024-07-15 00:15:35.948979] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:35.948995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.986 
[2024-07-15 00:15:35.949024] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:35.949039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.986 #30 NEW cov: 11764 ft: 13951 corp: 15/399b lim: 35 exec/s: 30 rss: 68Mb L: 29/34 MS: 1 ChangeBinInt- 00:07:36.986 [2024-07-15 00:15:36.009062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:36.009091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.986 [2024-07-15 00:15:36.009125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:36.009141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.986 [2024-07-15 00:15:36.009170] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:36.009186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.986 [2024-07-15 00:15:36.009215] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.986 [2024-07-15 00:15:36.009230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.245 #31 NEW cov: 11764 ft: 14047 corp: 16/433b lim: 35 exec/s: 31 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:07:37.245 [2024-07-15 00:15:36.069172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.069201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.069249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.069265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.069295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.069311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.069340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.069356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.245 #32 NEW cov: 11764 ft: 14069 corp: 17/462b lim: 35 exec/s: 32 rss: 68Mb L: 29/34 MS: 1 ChangeByte- 00:07:37.245 [2024-07-15 00:15:36.129328] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.129357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.129405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.129421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.129457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.129473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.129502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.129518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.245 #33 NEW cov: 11764 ft: 14092 corp: 18/492b lim: 35 exec/s: 33 rss: 68Mb L: 30/34 MS: 1 InsertByte- 00:07:37.245 [2024-07-15 00:15:36.189540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.189570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.189603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.189619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.189649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.189665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.189710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.189727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.245 #34 NEW cov: 11764 ft: 14106 corp: 19/521b lim: 35 exec/s: 34 rss: 68Mb L: 29/34 MS: 1 ShuffleBytes- 00:07:37.245 [2024-07-15 00:15:36.239537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.239567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.239600] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.239615] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.245 #35 NEW cov: 11764 ft: 14338 corp: 20/541b lim: 35 exec/s: 35 rss: 69Mb L: 20/34 MS: 1 EraseBytes- 00:07:37.245 [2024-07-15 00:15:36.289661] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.289693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.245 [2024-07-15 00:15:36.289742] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.245 [2024-07-15 00:15:36.289758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.504 #36 NEW cov: 11764 ft: 14351 corp: 21/561b lim: 35 exec/s: 36 rss: 69Mb L: 20/34 MS: 1 EraseBytes- 00:07:37.504 [2024-07-15 00:15:36.339903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.339934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.339968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.339985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.340015] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.340031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.340064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.340080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.504 #37 NEW cov: 11764 ft: 14445 corp: 22/595b lim: 35 exec/s: 37 rss: 69Mb L: 34/34 MS: 1 PersAutoDict- DE: "\000\037"- 00:07:37.504 [2024-07-15 00:15:36.400128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.400158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.400192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.400208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.400238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.400254] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.504 NEW_FUNC[1/2]: 0x4af5c0 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:37.504 NEW_FUNC[2/2]: 0x114c650 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1489 00:07:37.504 #38 NEW cov: 11821 ft: 14514 corp: 23/623b lim: 35 exec/s: 38 rss: 69Mb L: 28/34 MS: 1 ChangeBinInt- 00:07:37.504 [2024-07-15 00:15:36.450277] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.450307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.450356] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.450373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.450404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.450421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.450459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.450475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.504 [2024-07-15 00:15:36.450505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.504 [2024-07-15 00:15:36.450521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.505 #39 NEW cov: 11821 ft: 14641 corp: 24/658b lim: 35 exec/s: 39 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:07:37.505 [2024-07-15 00:15:36.500314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.505 [2024-07-15 00:15:36.500344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.505 [2024-07-15 00:15:36.500377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.505 [2024-07-15 00:15:36.500394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.505 [2024-07-15 00:15:36.500429] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:6 cdw10:80000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.505 [2024-07-15 00:15:36.500451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.505 NEW_FUNC[1/1]: 0x4b4ac0 in feat_interrupt_coalescing 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:325 00:07:37.505 #40 NEW cov: 11839 ft: 14664 corp: 25/686b lim: 35 exec/s: 40 rss: 69Mb L: 28/35 MS: 1 ChangeBinInt- 00:07:37.505 [2024-07-15 00:15:36.560225] ctrlr.c:1621:nvmf_ctrlr_set_features_error_recovery: *ERROR*: Host set unsupported DULBE bit 00:07:37.505 [2024-07-15 00:15:36.560348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ERROR_RECOVERY cid:4 cdw10:00000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.505 [2024-07-15 00:15:36.560371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.764 NEW_FUNC[1/2]: 0x4b2a70 in feat_error_recover /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:304 00:07:37.764 NEW_FUNC[2/2]: 0x114faf0 in nvmf_ctrlr_set_features_error_recovery /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1609 00:07:37.764 #44 NEW cov: 11889 ft: 14844 corp: 26/693b lim: 35 exec/s: 44 rss: 69Mb L: 7/35 MS: 4 PersAutoDict-CopyPart-ChangeBinInt-InsertByte- DE: "\000\037"- 00:07:37.764 [2024-07-15 00:15:36.610493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.610522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.610570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.610586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.764 #45 NEW cov: 11889 ft: 14921 corp: 27/709b lim: 35 exec/s: 45 rss: 69Mb L: 16/35 MS: 1 EraseBytes- 00:07:37.764 [2024-07-15 00:15:36.670825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.670854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.670902] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.670919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.670948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.670963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.670992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.671008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.671036] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.671051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.764 #46 NEW cov: 11889 ft: 14930 corp: 28/744b lim: 35 exec/s: 46 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:37.764 [2024-07-15 00:15:36.720927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.720963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.720997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.721013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.721042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.721058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.721103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.721119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.764 #47 NEW cov: 11889 ft: 14933 corp: 29/773b lim: 35 exec/s: 47 rss: 69Mb L: 29/35 MS: 1 InsertByte- 00:07:37.764 [2024-07-15 00:15:36.771082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.771110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.771157] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.771173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.771203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.771218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.771248] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.771262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.764 [2024-07-15 00:15:36.771291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.764 [2024-07-15 00:15:36.771306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.764 #48 NEW cov: 11896 ft: 14953 corp: 30/808b lim: 35 exec/s: 48 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:38.024 [2024-07-15 00:15:36.831177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.024 [2024-07-15 00:15:36.831207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.024 [2024-07-15 00:15:36.831256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.024 [2024-07-15 00:15:36.831272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.024 [2024-07-15 00:15:36.831301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.024 [2024-07-15 00:15:36.831317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.024 [2024-07-15 00:15:36.831351] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.024 [2024-07-15 00:15:36.831366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.024 #49 NEW cov: 11896 ft: 14961 corp: 31/837b lim: 35 exec/s: 49 rss: 69Mb L: 29/35 MS: 1 ChangeBit- 00:07:38.024 [2024-07-15 00:15:36.881153] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.024 [2024-07-15 00:15:36.881183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.024 [2024-07-15 00:15:36.881231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.024 [2024-07-15 00:15:36.881247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.024 #50 NEW cov: 11896 ft: 15000 corp: 32/857b lim: 35 exec/s: 25 rss: 69Mb L: 20/35 MS: 1 CrossOver- 00:07:38.024 #50 DONE cov: 11896 ft: 15000 corp: 32/857b lim: 35 exec/s: 25 rss: 69Mb 00:07:38.024 ###### Recommended dictionary. ###### 00:07:38.024 "\000\037" # Uses: 4 00:07:38.024 ###### End of recommended dictionary. 
###### 00:07:38.024 Done 50 runs in 2 second(s) 00:07:38.024 00:15:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:38.024 00:15:37 -- ../common.sh@72 -- # (( i++ )) 00:07:38.024 00:15:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.024 00:15:37 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:38.024 00:15:37 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:38.024 00:15:37 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.024 00:15:37 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.024 00:15:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:38.024 00:15:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:38.024 00:15:37 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:38.024 00:15:37 -- nvmf/run.sh@29 -- # port=4415 00:07:38.024 00:15:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:38.024 00:15:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:38.024 00:15:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.024 00:15:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:38.283 [2024-07-15 00:15:37.083590] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:38.283 [2024-07-15 00:15:37.083661] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid332929 ] 00:07:38.283 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.283 [2024-07-15 00:15:37.279809] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.542 [2024-07-15 00:15:37.342477] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:38.542 [2024-07-15 00:15:37.342611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.542 [2024-07-15 00:15:37.400532] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.542 [2024-07-15 00:15:37.416830] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:38.542 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.542 INFO: Seed: 597648810 00:07:38.542 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:38.542 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:38.542 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:38.542 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.542 #2 INITED exec/s: 0 rss: 60Mb 00:07:38.542 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:38.542 This may also happen if the target rejected all inputs we tried so far 00:07:38.542 [2024-07-15 00:15:37.464959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.542 [2024-07-15 00:15:37.464987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.542 [2024-07-15 00:15:37.465046] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.542 [2024-07-15 00:15:37.465059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.802 NEW_FUNC[1/670]: 0x496290 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:38.802 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.802 #6 NEW cov: 11491 ft: 11492 corp: 2/20b lim: 35 exec/s: 0 rss: 66Mb L: 19/19 MS: 4 InsertByte-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:38.802 [2024-07-15 00:15:37.785868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.802 [2024-07-15 00:15:37.785903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.802 [2024-07-15 00:15:37.785966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.802 [2024-07-15 00:15:37.785981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.802 #7 NEW cov: 11604 ft: 11987 corp: 3/40b lim: 35 exec/s: 0 rss: 66Mb L: 20/20 MS: 1 CrossOver- 00:07:38.802 [2024-07-15 00:15:37.825893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.802 [2024-07-15 00:15:37.825918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.802 [2024-07-15 00:15:37.825997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.802 [2024-07-15 00:15:37.826011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.802 #8 NEW cov: 11610 ft: 12346 corp: 4/60b lim: 35 exec/s: 0 rss: 66Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:39.062 [2024-07-15 00:15:37.866150] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.866175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:37.866237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.866251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:39.062 NEW_FUNC[1/1]: 0x4b60f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:39.062 #15 NEW cov: 11709 ft: 12781 corp: 5/81b lim: 35 exec/s: 0 rss: 66Mb L: 21/21 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:39.062 [2024-07-15 00:15:37.906108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.906133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:37.906198] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.906212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.062 #16 NEW cov: 11709 ft: 12818 corp: 6/100b lim: 35 exec/s: 0 rss: 66Mb L: 19/21 MS: 1 CrossOver- 00:07:39.062 [2024-07-15 00:15:37.946168] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.946192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:37.946257] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.946270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.062 #17 NEW cov: 11709 ft: 12957 corp: 7/120b lim: 35 exec/s: 0 rss: 67Mb L: 20/21 MS: 1 ShuffleBytes- 00:07:39.062 [2024-07-15 00:15:37.986446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.986471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:37.986553] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.986568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:37.986635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:37.986648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.062 #18 NEW cov: 11709 ft: 13237 corp: 8/143b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 CMP- DE: "\006\000\000\000"- 00:07:39.062 [2024-07-15 00:15:38.026425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.026453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:38.026529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.026544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.062 #19 NEW cov: 11709 ft: 13300 corp: 9/163b lim: 35 exec/s: 0 rss: 67Mb L: 20/23 MS: 1 CMP- DE: "\006\000"- 00:07:39.062 [2024-07-15 00:15:38.066668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.066692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:38.066775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.066789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:38.066853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.066867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.062 #20 NEW cov: 11709 ft: 13311 corp: 10/187b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:07:39.062 [2024-07-15 00:15:38.106671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.106695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.062 [2024-07-15 00:15:38.106755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.062 [2024-07-15 00:15:38.106769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.321 #21 NEW cov: 11709 ft: 13355 corp: 11/207b lim: 35 exec/s: 0 rss: 67Mb L: 20/24 MS: 1 ChangeBit- 00:07:39.321 [2024-07-15 00:15:38.146784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.321 [2024-07-15 00:15:38.146808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.321 [2024-07-15 00:15:38.146870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.321 [2024-07-15 00:15:38.146884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.321 #22 NEW cov: 11709 ft: 13387 corp: 12/227b lim: 35 exec/s: 0 rss: 67Mb L: 20/24 MS: 1 ChangeByte- 00:07:39.321 [2024-07-15 00:15:38.177061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.321 [2024-07-15 00:15:38.177085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.177165] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: 
GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.177180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.177243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.177256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.322 #23 NEW cov: 11709 ft: 13424 corp: 13/252b lim: 35 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:39.322 [2024-07-15 00:15:38.217161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.217185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.217239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.217254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.217318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.217332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.322 #24 NEW cov: 11709 ft: 13468 corp: 14/276b lim: 35 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 ChangeBinInt- 00:07:39.322 [2024-07-15 00:15:38.257243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.257269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.257347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.257366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.257431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.257450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.322 #25 NEW cov: 11709 ft: 13470 corp: 15/300b lim: 35 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 ChangeBit- 00:07:39.322 [2024-07-15 00:15:38.297376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.297401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.297523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:39.322 [2024-07-15 00:15:38.297539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.322 NEW_FUNC[1/1]: 0x4b3bc0 in feat_number_of_queues /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:318 00:07:39.322 #26 NEW cov: 11741 ft: 13564 corp: 16/324b lim: 35 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 CMP- DE: "\007\000"- 00:07:39.322 [2024-07-15 00:15:38.337374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.337400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.322 [2024-07-15 00:15:38.337462] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.322 [2024-07-15 00:15:38.337476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.322 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:39.322 #27 NEW cov: 11764 ft: 13668 corp: 17/343b lim: 35 exec/s: 0 rss: 67Mb L: 19/25 MS: 1 ChangeBit- 00:07:39.582 [2024-07-15 00:15:38.377673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.377700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.377767] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.377783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.377849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.377863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.582 #28 NEW cov: 11764 ft: 13696 corp: 18/367b lim: 35 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 ChangeBit- 00:07:39.582 [2024-07-15 00:15:38.417752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.417778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.417843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.417857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.417926] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.417940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:39.582 #29 NEW cov: 11764 ft: 13731 corp: 19/391b lim: 35 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 ChangeBinInt- 00:07:39.582 [2024-07-15 00:15:38.457966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.457991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.458073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.458088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.458150] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000012e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.458164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.458227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.458240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.582 #30 NEW cov: 11764 ft: 14174 corp: 20/419b lim: 35 exec/s: 30 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:39.582 [2024-07-15 00:15:38.497947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.497972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.498054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.498069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.498134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000021 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.498148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.582 #31 NEW cov: 11764 ft: 14184 corp: 21/443b lim: 35 exec/s: 31 rss: 68Mb L: 24/28 MS: 1 ChangeByte- 00:07:39.582 [2024-07-15 00:15:38.537961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.537986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.538065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.538080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 #32 NEW cov: 11764 ft: 14202 corp: 22/462b lim: 
35 exec/s: 32 rss: 68Mb L: 19/28 MS: 1 ShuffleBytes- 00:07:39.582 [2024-07-15 00:15:38.578172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.578197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.578263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.578280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.578343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.578358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.582 #33 NEW cov: 11764 ft: 14222 corp: 23/486b lim: 35 exec/s: 33 rss: 68Mb L: 24/28 MS: 1 CrossOver- 00:07:39.582 [2024-07-15 00:15:38.618138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.618164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.582 [2024-07-15 00:15:38.618228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.582 [2024-07-15 00:15:38.618242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.582 #35 NEW cov: 11764 ft: 14244 corp: 24/506b lim: 35 exec/s: 35 rss: 68Mb L: 20/28 MS: 2 ChangeBit-CrossOver- 00:07:39.842 [2024-07-15 00:15:38.648218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.648244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.648306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.648321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.842 #36 NEW cov: 11764 ft: 14268 corp: 25/520b lim: 35 exec/s: 36 rss: 68Mb L: 14/28 MS: 1 EraseBytes- 00:07:39.842 [2024-07-15 00:15:38.688377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.688403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.688483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.688499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:39.842 #37 NEW cov: 11764 ft: 14305 corp: 26/540b lim: 35 exec/s: 37 rss: 68Mb L: 20/28 MS: 1 ShuffleBytes- 00:07:39.842 [2024-07-15 00:15:38.728686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.728711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.728758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.728772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.842 #40 NEW cov: 11764 ft: 14311 corp: 27/567b lim: 35 exec/s: 40 rss: 68Mb L: 27/28 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:07:39.842 [2024-07-15 00:15:38.758730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.758756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.758823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.758840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.758906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.758919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.842 #41 NEW cov: 11764 ft: 14319 corp: 28/590b lim: 35 exec/s: 41 rss: 68Mb L: 23/28 MS: 1 ChangeBit- 00:07:39.842 [2024-07-15 00:15:38.798688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.798713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.798773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.798787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.842 #42 NEW cov: 11764 ft: 14323 corp: 29/609b lim: 35 exec/s: 42 rss: 68Mb L: 19/28 MS: 1 PersAutoDict- DE: "\006\000"- 00:07:39.842 [2024-07-15 00:15:38.829102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.829127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.829193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.829207] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.829272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.829285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.829347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000012e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.829361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.842 #43 NEW cov: 11764 ft: 14332 corp: 30/638b lim: 35 exec/s: 43 rss: 68Mb L: 29/29 MS: 1 CrossOver- 00:07:39.842 [2024-07-15 00:15:38.868910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.868934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.842 [2024-07-15 00:15:38.868999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.842 [2024-07-15 00:15:38.869013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.842 #44 NEW cov: 11764 ft: 14340 corp: 31/658b lim: 35 exec/s: 44 rss: 68Mb L: 20/29 MS: 1 CopyPart- 00:07:40.102 [2024-07-15 00:15:38.909039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.909064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:38.909132] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.909145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.102 #45 NEW cov: 11764 ft: 14342 corp: 32/678b lim: 35 exec/s: 45 rss: 68Mb L: 20/29 MS: 1 CrossOver- 00:07:40.102 [2024-07-15 00:15:38.939107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.939131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:38.939194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.939208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.102 #46 NEW cov: 11764 ft: 14345 corp: 33/698b lim: 35 exec/s: 46 rss: 68Mb L: 20/29 MS: 1 ChangeByte- 00:07:40.102 [2024-07-15 00:15:38.979362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.979387] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:38.979457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.979472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:38.979532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:38.979546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.102 #47 NEW cov: 11764 ft: 14348 corp: 34/722b lim: 35 exec/s: 47 rss: 68Mb L: 24/29 MS: 1 ChangeByte- 00:07:40.102 [2024-07-15 00:15:39.019473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:39.019498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:39.019562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:39.019576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:39.019637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:39.019651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.102 #48 NEW cov: 11764 ft: 14385 corp: 35/747b lim: 35 exec/s: 48 rss: 68Mb L: 25/29 MS: 1 InsertByte- 00:07:40.102 [2024-07-15 00:15:39.059641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:39.059665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:39.059728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:39.059742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.102 [2024-07-15 00:15:39.059806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000021 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.102 [2024-07-15 00:15:39.059819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 #49 NEW cov: 11764 ft: 14389 corp: 36/772b lim: 35 exec/s: 49 rss: 68Mb L: 25/29 MS: 1 InsertByte- 00:07:40.103 [2024-07-15 00:15:39.099737] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.103 [2024-07-15 00:15:39.099761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:40.103 [2024-07-15 00:15:39.099829] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.103 [2024-07-15 00:15:39.099843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-07-15 00:15:39.099905] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000021 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.103 [2024-07-15 00:15:39.099918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.103 #50 NEW cov: 11764 ft: 14422 corp: 37/797b lim: 35 exec/s: 50 rss: 68Mb L: 25/29 MS: 1 CrossOver- 00:07:40.103 [2024-07-15 00:15:39.139869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000348 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.103 [2024-07-15 00:15:39.139894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.103 [2024-07-15 00:15:39.139959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.103 [2024-07-15 00:15:39.139973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.103 [2024-07-15 00:15:39.140053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.103 [2024-07-15 00:15:39.140067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.363 #51 NEW cov: 11764 ft: 14459 corp: 38/822b lim: 35 exec/s: 51 rss: 69Mb L: 25/29 MS: 1 InsertByte- 00:07:40.363 [2024-07-15 00:15:39.180066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000493 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.180091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.180176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.180190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.363 #52 NEW cov: 11764 ft: 14465 corp: 39/849b lim: 35 exec/s: 52 rss: 69Mb L: 27/29 MS: 1 ChangeBinInt- 00:07:40.363 [2024-07-15 00:15:39.220171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.220196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.220255] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.220269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.363 #53 NEW cov: 11764 ft: 14478 corp: 40/876b lim: 35 exec/s: 53 rss: 
69Mb L: 27/29 MS: 1 ShuffleBytes- 00:07:40.363 [2024-07-15 00:15:39.260213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.260237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.260300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.260317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.260378] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000021 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.260392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.363 #54 NEW cov: 11764 ft: 14514 corp: 41/901b lim: 35 exec/s: 54 rss: 69Mb L: 25/29 MS: 1 ChangeBit- 00:07:40.363 [2024-07-15 00:15:39.300228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000048 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.300253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.300314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.300327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.363 #55 NEW cov: 11764 ft: 14531 corp: 42/915b lim: 35 exec/s: 55 rss: 69Mb L: 14/29 MS: 1 ChangeBinInt- 00:07:40.363 [2024-07-15 00:15:39.340496] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000348 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.340520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.340580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000005d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.340593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 00:15:39.340655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.340668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.363 #56 NEW cov: 11764 ft: 14546 corp: 43/941b lim: 35 exec/s: 56 rss: 69Mb L: 26/29 MS: 1 InsertByte- 00:07:40.363 [2024-07-15 00:15:39.380582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.363 [2024-07-15 00:15:39.380607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.363 [2024-07-15 
00:15:39.380670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:40.363 [2024-07-15 00:15:39.380683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:40.363 [2024-07-15 00:15:39.380722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000002d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:40.363 [2024-07-15 00:15:39.380735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:40.363 #57 NEW cov: 11764 ft: 14591 corp: 44/966b lim: 35 exec/s: 57 rss: 69Mb L: 25/29 MS: 1 InsertByte-
00:07:40.622 [2024-07-15 00:15:39.420892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:40.622 [2024-07-15 00:15:39.420917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:40.622 [2024-07-15 00:15:39.420995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:40.622 [2024-07-15 00:15:39.421010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:40.622 [2024-07-15 00:15:39.421073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:40.622 [2024-07-15 00:15:39.421087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:40.622 #58 NEW cov: 11764 ft: 14643 corp: 45/997b lim: 35 exec/s: 29 rss: 69Mb L: 31/31 MS: 1 PersAutoDict- DE: "\006\000\000\000"-
00:07:40.622 #58 DONE cov: 11764 ft: 14643 corp: 45/997b lim: 35 exec/s: 29 rss: 69Mb
00:07:40.622 ###### Recommended dictionary. ######
00:07:40.622 "\006\000\000\000" # Uses: 1
00:07:40.622 "\006\000" # Uses: 1
00:07:40.622 "\007\000" # Uses: 0
00:07:40.622 ###### End of recommended dictionary. ######
00:07:40.622 Done 58 runs in 2 second(s)
00:07:40.622 00:15:39 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:07:40.622 00:15:39 -- ../common.sh@72 -- # (( i++ ))
00:07:40.622 00:15:39 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:40.622 00:15:39 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:07:40.622 00:15:39 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:07:40.622 00:15:39 -- nvmf/run.sh@24 -- # local timen=1
00:07:40.622 00:15:39 -- nvmf/run.sh@25 -- # local core=0x1
00:07:40.622 00:15:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:40.622 00:15:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:07:40.622 00:15:39 -- nvmf/run.sh@29 -- # printf %02d 16
00:07:40.622 00:15:39 -- nvmf/run.sh@29 -- # port=4416
00:07:40.623 00:15:39 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:40.623 00:15:39 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:07:40.623 00:15:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:40.623 00:15:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:07:40.623 [2024-07-15 00:15:39.611783] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:40.623 [2024-07-15 00:15:39.611852] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid333345 ]
00:07:40.881 EAL: No free 2048 kB hugepages reported on node 1
00:07:40.881 [2024-07-15 00:15:39.786017] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:40.881 [2024-07-15 00:15:39.848707] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:40.881 [2024-07-15 00:15:39.848852] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:40.882 [2024-07-15 00:15:39.907027] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:40.882 [2024-07-15 00:15:39.923322] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:07:41.141 INFO: Running with entropic power schedule (0xFF, 100).
00:07:41.141 INFO: Seed: 3106601878
00:07:41.141 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:41.141 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:41.141 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:41.141 INFO: A corpus is not provided, starting from an empty corpus
00:07:41.141 #2 INITED exec/s: 0 rss: 61Mb
00:07:41.141 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:41.141 This may also happen if the target rejected all inputs we tried so far 00:07:41.141 [2024-07-15 00:15:39.968648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.141 [2024-07-15 00:15:39.968681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.141 [2024-07-15 00:15:39.968756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.141 [2024-07-15 00:15:39.968773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.141 [2024-07-15 00:15:39.968828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.141 [2024-07-15 00:15:39.968843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.401 NEW_FUNC[1/671]: 0x497740 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:41.401 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.401 #10 NEW cov: 11594 ft: 11595 corp: 2/80b lim: 105 exec/s: 0 rss: 67Mb L: 79/79 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:41.401 [2024-07-15 00:15:40.299511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.401 [2024-07-15 00:15:40.299563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.401 [2024-07-15 00:15:40.299638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.401 [2024-07-15 00:15:40.299653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.401 [2024-07-15 00:15:40.299708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.299724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.402 #16 NEW cov: 11707 ft: 11998 corp: 3/160b lim: 105 exec/s: 0 rss: 67Mb L: 80/80 MS: 1 InsertByte- 00:07:41.402 [2024-07-15 00:15:40.349546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.349575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.349613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.349629] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.349683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.349698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.402 #17 NEW cov: 11713 ft: 12191 corp: 4/239b lim: 105 exec/s: 0 rss: 67Mb L: 79/80 MS: 1 CopyPart- 00:07:41.402 [2024-07-15 00:15:40.389665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.389694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.389732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.389746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.389803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.389818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.402 #18 NEW cov: 11798 ft: 12429 corp: 5/319b lim: 105 exec/s: 0 rss: 67Mb L: 80/80 MS: 1 ShuffleBytes- 00:07:41.402 [2024-07-15 00:15:40.429880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.429907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.429954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.429969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.430025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11067969177764075929 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.430039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.402 [2024-07-15 00:15:40.430092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.402 [2024-07-15 00:15:40.430106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.402 #19 NEW cov: 11798 ft: 13052 corp: 6/415b lim: 105 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:07:41.662 [2024-07-15 00:15:40.469846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.469874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.469912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.469926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.469979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.469995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.662 #20 NEW cov: 11798 ft: 13175 corp: 7/494b lim: 105 exec/s: 0 rss: 67Mb L: 79/96 MS: 1 ChangeByte- 00:07:41.662 [2024-07-15 00:15:40.510092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.510119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.510169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.510184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.510236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.510252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.510308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.510324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.662 #21 NEW cov: 11798 ft: 13373 corp: 8/592b lim: 105 exec/s: 0 rss: 67Mb L: 98/98 MS: 1 CMP- DE: "\000\000"- 00:07:41.662 [2024-07-15 00:15:40.550111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.550139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.550190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.550207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.550262] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.550278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.662 #22 NEW cov: 11798 ft: 13402 corp: 9/672b lim: 105 exec/s: 0 rss: 68Mb L: 80/98 MS: 1 ShuffleBytes- 00:07:41.662 [2024-07-15 00:15:40.590281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.590307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.590375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.590390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.590448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.590462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.590517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.590532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.662 #23 NEW cov: 11798 ft: 13482 corp: 10/770b lim: 105 exec/s: 0 rss: 68Mb L: 98/98 MS: 1 ChangeBinInt- 00:07:41.662 [2024-07-15 00:15:40.630421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.630450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.630516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560342528 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.630532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.630595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.630613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.630668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.630683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.662 #24 NEW 
cov: 11798 ft: 13530 corp: 11/868b lim: 105 exec/s: 0 rss: 68Mb L: 98/98 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:41.662 [2024-07-15 00:15:40.670539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.670565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.670616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23454040412345088 len:21402 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.670632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.670683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:168884986026393 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.670697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.670751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.670766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.662 #25 NEW cov: 11798 ft: 13550 corp: 12/967b lim: 105 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 InsertByte- 00:07:41.662 [2024-07-15 00:15:40.710655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.710682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.662 [2024-07-15 00:15:40.710734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.662 [2024-07-15 00:15:40.710750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.663 [2024-07-15 00:15:40.710803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43265342748334336 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.663 [2024-07-15 00:15:40.710817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.663 [2024-07-15 00:15:40.710869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.663 [2024-07-15 00:15:40.710883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.923 #26 NEW cov: 11798 ft: 13635 corp: 13/1065b lim: 105 exec/s: 0 rss: 68Mb L: 98/99 MS: 1 ChangeByte- 00:07:41.923 [2024-07-15 00:15:40.750735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363835 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 
00:15:40.750762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.750801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.750819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.750872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.750887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.750941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.750955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.923 #27 NEW cov: 11798 ft: 13683 corp: 14/1163b lim: 105 exec/s: 0 rss: 68Mb L: 98/99 MS: 1 ChangeByte- 00:07:41.923 [2024-07-15 00:15:40.790466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.790493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.923 #28 NEW cov: 11798 ft: 14222 corp: 15/1184b lim: 105 exec/s: 0 rss: 68Mb L: 21/99 MS: 1 CrossOver- 00:07:41.923 [2024-07-15 00:15:40.830817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.830844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.830891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.830906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.830975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.830991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.923 #29 NEW cov: 11798 ft: 14245 corp: 16/1263b lim: 105 exec/s: 0 rss: 68Mb L: 79/99 MS: 1 CopyPart- 00:07:41.923 [2024-07-15 00:15:40.870971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.870998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.871047] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.871062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.871116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.871131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.923 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.923 #30 NEW cov: 11821 ft: 14273 corp: 17/1342b lim: 105 exec/s: 0 rss: 68Mb L: 79/99 MS: 1 ChangeBit- 00:07:41.923 [2024-07-15 00:15:40.911211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345554903099 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.911240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.911281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.911297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.911350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.911367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.911418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.911434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.923 #31 NEW cov: 11821 ft: 14295 corp: 18/1440b lim: 105 exec/s: 0 rss: 68Mb L: 98/99 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:41.923 [2024-07-15 00:15:40.951440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.951471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.951511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.951526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.923 [2024-07-15 00:15:40.951580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.923 [2024-07-15 00:15:40.951595] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.923 #32 NEW cov: 11821 ft: 14346 corp: 19/1520b lim: 105 exec/s: 32 rss: 68Mb L: 80/99 MS: 1 CopyPart- 00:07:42.183 [2024-07-15 00:15:40.991463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363835 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:40.991491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:40.991565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:40.991581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:40.991635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:40.991650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:40.991704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:40.991719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.183 #33 NEW cov: 11821 ft: 14372 corp: 20/1618b lim: 105 exec/s: 33 rss: 68Mb L: 98/99 MS: 1 CrossOver- 00:07:42.183 [2024-07-15 00:15:41.031611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.031641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.031697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.031712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.031766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1610612736 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.031780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.031836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.031851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.183 #34 NEW cov: 11821 ft: 14466 corp: 21/1714b lim: 105 exec/s: 34 rss: 68Mb L: 96/99 MS: 1 ChangeBinInt- 00:07:42.183 [2024-07-15 00:15:41.071556] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.071583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.071645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.071660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.071720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.071734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.183 #36 NEW cov: 11821 ft: 14534 corp: 22/1791b lim: 105 exec/s: 36 rss: 68Mb L: 77/99 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:42.183 [2024-07-15 00:15:41.111788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.111814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.111880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.111896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.183 [2024-07-15 00:15:41.111948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.183 [2024-07-15 00:15:41.111964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.112017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.112032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.184 #37 NEW cov: 11821 ft: 14583 corp: 23/1892b lim: 105 exec/s: 37 rss: 68Mb L: 101/101 MS: 1 CopyPart- 00:07:42.184 [2024-07-15 00:15:41.151913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.151941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.151996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23454040412345088 len:21402 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.152012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 
00:15:41.152070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:168884986026393 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.152085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.152142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.152157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.184 #38 NEW cov: 11821 ft: 14597 corp: 24/1991b lim: 105 exec/s: 38 rss: 68Mb L: 99/101 MS: 1 CrossOver- 00:07:42.184 [2024-07-15 00:15:41.192004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21312 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.192031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.192095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4557430974698176319 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.192109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.192165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.192178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.192234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.192249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.184 #39 NEW cov: 11821 ft: 14613 corp: 25/2088b lim: 105 exec/s: 39 rss: 68Mb L: 97/101 MS: 1 InsertRepeatedBytes- 00:07:42.184 [2024-07-15 00:15:41.232119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.232145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.232191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560342528 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.232206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.232260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43234556422756608 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.232275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:42.184 [2024-07-15 00:15:41.232329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004235084294738771 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.184 [2024-07-15 00:15:41.232347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.444 #40 NEW cov: 11821 ft: 14627 corp: 26/2189b lim: 105 exec/s: 40 rss: 68Mb L: 101/101 MS: 1 InsertRepeatedBytes- 00:07:42.444 [2024-07-15 00:15:41.272230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.272256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.272305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.272320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.272375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.272390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.272446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.272462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.444 #41 NEW cov: 11821 ft: 14645 corp: 27/2275b lim: 105 exec/s: 41 rss: 69Mb L: 86/101 MS: 1 CopyPart- 00:07:42.444 [2024-07-15 00:15:41.312283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.312310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.312345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.312360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.312415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.312430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.444 #42 NEW cov: 11821 ft: 14680 corp: 28/2355b lim: 105 exec/s: 42 rss: 69Mb L: 80/101 MS: 1 ShuffleBytes- 00:07:42.444 [2024-07-15 00:15:41.352477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.352504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.352570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.352585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.352643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1610612736 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.352657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.352715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.352732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.444 #43 NEW cov: 11821 ft: 14682 corp: 29/2451b lim: 105 exec/s: 43 rss: 69Mb L: 96/101 MS: 1 ChangeBit- 00:07:42.444 [2024-07-15 00:15:41.392596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.392624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.392686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23454040412345088 len:21402 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.392701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.392756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:168884986026393 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.392771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.392826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.392840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.444 #44 NEW cov: 11821 ft: 14698 corp: 30/2554b lim: 105 exec/s: 44 rss: 69Mb L: 103/103 MS: 1 InsertRepeatedBytes- 00:07:42.444 [2024-07-15 00:15:41.432748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.432777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.432819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 
len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.432833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.432889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:43265342748334336 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.432903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.432961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.432975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.444 #45 NEW cov: 11821 ft: 14703 corp: 31/2652b lim: 105 exec/s: 45 rss: 69Mb L: 98/103 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:42.444 [2024-07-15 00:15:41.472827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.472855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.444 [2024-07-15 00:15:41.472919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234344167854931 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.444 [2024-07-15 00:15:41.472934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.445 [2024-07-15 00:15:41.472990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11068046444225730969 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.445 [2024-07-15 00:15:41.473009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.445 [2024-07-15 00:15:41.473062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.445 [2024-07-15 00:15:41.473077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.445 #46 NEW cov: 11821 ft: 14711 corp: 32/2756b lim: 105 exec/s: 46 rss: 69Mb L: 104/104 MS: 1 CopyPart- 00:07:42.705 [2024-07-15 00:15:41.512945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.512973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.513034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234348144055123 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.513049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.513105] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11067969177764075929 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.513119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.513176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.513191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.705 #47 NEW cov: 11821 ft: 14715 corp: 33/2852b lim: 105 exec/s: 47 rss: 69Mb L: 96/104 MS: 1 ChangeByte- 00:07:42.705 [2024-07-15 00:15:41.553108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.553134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.553198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.553214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.553271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234344162415443 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.553285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.553343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.553359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.705 #48 NEW cov: 11821 ft: 14735 corp: 34/2954b lim: 105 exec/s: 48 rss: 69Mb L: 102/104 MS: 1 InsertRepeatedBytes- 00:07:42.705 [2024-07-15 00:15:41.593194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.593222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.593270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.593289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.593345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2893606913523066920 len:10281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.593359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 
00:15:41.593411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.593426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.705 #49 NEW cov: 11821 ft: 14748 corp: 35/3053b lim: 105 exec/s: 49 rss: 69Mb L: 99/104 MS: 1 InsertRepeatedBytes- 00:07:42.705 [2024-07-15 00:15:41.633231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.633258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.633295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.633311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.633369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5991005021654963027 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.633384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 #50 NEW cov: 11821 ft: 14760 corp: 36/3134b lim: 105 exec/s: 50 rss: 69Mb L: 81/104 MS: 1 InsertByte- 00:07:42.705 [2024-07-15 00:15:41.673327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.673354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.673412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.673429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.673490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:4627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.673505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 #51 NEW cov: 11821 ft: 14778 corp: 37/3211b lim: 105 exec/s: 51 rss: 69Mb L: 77/104 MS: 1 ChangeByte- 00:07:42.705 [2024-07-15 00:15:41.713449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.713476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.713528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 
00:15:41.713544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.713603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.713618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 #52 NEW cov: 11821 ft: 14784 corp: 38/3291b lim: 105 exec/s: 52 rss: 70Mb L: 80/104 MS: 1 InsertByte- 00:07:42.705 [2024-07-15 00:15:41.753694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.753721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.753785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.753801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.753855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11067969177764075929 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.753871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.705 [2024-07-15 00:15:41.753927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.705 [2024-07-15 00:15:41.753943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.965 #53 NEW cov: 11821 ft: 14793 corp: 39/3387b lim: 105 exec/s: 53 rss: 70Mb L: 96/104 MS: 1 ChangeBit- 00:07:42.965 [2024-07-15 00:15:41.793770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.965 [2024-07-15 00:15:41.793797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.965 [2024-07-15 00:15:41.793846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.965 [2024-07-15 00:15:41.793863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.965 [2024-07-15 00:15:41.793916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234344195969875 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.965 [2024-07-15 00:15:41.793931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.965 [2024-07-15 00:15:41.793983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:42.965 [2024-07-15 00:15:41.793996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.965 #54 NEW cov: 11821 ft: 14814 corp: 40/3489b lim: 105 exec/s: 54 rss: 70Mb L: 102/104 MS: 1 ChangeBit- 00:07:42.965 [2024-07-15 00:15:41.834021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.834048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.834099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.834115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.834173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4991471925338843205 len:10281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.834188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.834243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234344836114515 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.834258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.834314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.834330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:42.966 #55 NEW cov: 11821 ft: 14883 corp: 41/3594b lim: 105 exec/s: 55 rss: 70Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:07:42.966 [2024-07-15 00:15:41.874021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.874048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.874098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.874113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.874166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1610612736 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.874180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.874233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 
len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.874248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.966 #56 NEW cov: 11821 ft: 14897 corp: 42/3690b lim: 105 exec/s: 56 rss: 70Mb L: 96/105 MS: 1 ChangeByte- 00:07:42.966 [2024-07-15 00:15:41.914117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363835 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.914143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.914194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.914209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.914264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.914279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.914334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.914348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.966 #57 NEW cov: 11821 ft: 14900 corp: 43/3789b lim: 105 exec/s: 57 rss: 70Mb L: 99/105 MS: 1 InsertByte- 00:07:42.966 [2024-07-15 00:15:41.954244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.954271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.954320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23454040412345088 len:21402 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.954335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.954403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6004234346734768979 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.954419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.966 [2024-07-15 00:15:41.954477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.966 [2024-07-15 00:15:41.954493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.966 #58 NEW cov: 11821 ft: 14931 corp: 44/3880b lim: 105 exec/s: 29 rss: 70Mb L: 91/105 MS: 1 
EraseBytes-
00:07:42.966 #58 DONE cov: 11821 ft: 14931 corp: 44/3880b lim: 105 exec/s: 29 rss: 70Mb
00:07:42.966 ###### Recommended dictionary. ######
00:07:42.966 "\000\000" # Uses: 2
00:07:42.966 "\377\377\377\377\377\377\377\377" # Uses: 0
00:07:42.966 ###### End of recommended dictionary. ######
00:07:42.966 Done 58 runs in 2 second(s)
00:07:43.225 00:15:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
00:07:43.225 00:15:42 -- ../common.sh@72 -- # (( i++ ))
00:07:43.225 00:15:42 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:43.225 00:15:42 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:07:43.225 00:15:42 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:07:43.225 00:15:42 -- nvmf/run.sh@24 -- # local timen=1
00:07:43.225 00:15:42 -- nvmf/run.sh@25 -- # local core=0x1
00:07:43.225 00:15:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:43.225 00:15:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:07:43.225 00:15:42 -- nvmf/run.sh@29 -- # printf %02d 17
00:07:43.225 00:15:42 -- nvmf/run.sh@29 -- # port=4417
00:07:43.225 00:15:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:43.225 00:15:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:07:43.225 00:15:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:43.225 00:15:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
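The traced commands above (nvmf/run.sh@23 through @36) are the entire per-run setup: derive a port from the fuzzer index (44 plus the zero-padded index, hence 4417 for run 17), rewrite the JSON target config for that port, build the NVMe-oF transport ID string, and launch the fuzzer against it. A minimal stand-alone sketch of that sequence follows, assuming the same workspace layout as this job; the variable names are illustrative, the redirection of the sed output into /tmp/fuzz_json_17.conf is inferred (bash xtrace does not print redirections), and nothing in this log verifies that llvm_nvme_fuzz behaves identically when launched outside nvmf/run.sh.

#!/usr/bin/env bash
# Sketch of the per-run setup traced above; every flag and path is copied
# from the nvmf/run.sh trace in this log, not from a documented interface.
set -e
i=17                                    # fuzzer index ($1 of start_llvm_fuzz)
spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
port=44$(printf %02d "$i")              # yields 4417, matching "port=4417" above
mkdir -p "$spdk/../corpus/llvm_nvmf_$i"
# Point the target listener at this run's port (same sed as nvmf/run.sh@33);
# writing the result to the -c config file is an inference, see note above.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
"$spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$spdk/../output/llvm/" -F "$trid" -c "/tmp/fuzz_json_$i.conf" \
    -t 1 -D "$spdk/../corpus/llvm_nvmf_$i" -Z "$i" -r "/var/tmp/spdk$i.sock"

Note also that the "###### Recommended dictionary. ######" block printed at the end of run 16 uses libFuzzer's dictionary-file syntax, so those quoted tokens ("\000\000" and "\377\377\377\377\377\377\377\377") could be saved to a file and reused wherever a plain libFuzzer target accepts -dict=; this log does not show the SPDK wrapper forwarding that flag, so treat that as an assumption.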
00:07:43.485 [2024-07-15 00:15:42.139289] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:43.485 [2024-07-15 00:15:42.139363] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid333882 ]
00:07:43.485 EAL: No free 2048 kB hugepages reported on node 1
00:07:43.485 [2024-07-15 00:15:42.319167] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:43.485 [2024-07-15 00:15:42.380651] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:43.485 [2024-07-15 00:15:42.380797] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:43.485 [2024-07-15 00:15:42.438747] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:43.485 [2024-07-15 00:15:42.455028] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:07:43.485 INFO: Running with entropic power schedule (0xFF, 100).
00:07:43.485 INFO: Seed: 1341631128
00:07:43.485 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:43.485 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:43.485 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:43.485 INFO: A corpus is not provided, starting from an empty corpus
00:07:43.485 #2 INITED exec/s: 0 rss: 60Mb
00:07:43.485 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:43.485 This may also happen if the target rejected all inputs we tried so far
00:07:43.485 [2024-07-15 00:15:42.503723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:43.485 [2024-07-15 00:15:42.503754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:43.744
cid:0 nsid:0 lba:2315291136 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.004 [2024-07-15 00:15:42.975004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.004 #29 NEW cov: 11819 ft: 12908 corp: 7/215b lim: 120 exec/s: 0 rss: 67Mb L: 34/37 MS: 2 EraseBytes-CopyPart- 00:07:44.004 [2024-07-15 00:15:43.015144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.004 [2024-07-15 00:15:43.015173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.004 #30 NEW cov: 11819 ft: 13003 corp: 8/249b lim: 120 exec/s: 0 rss: 67Mb L: 34/37 MS: 1 InsertByte- 00:07:44.004 [2024-07-15 00:15:43.055225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.004 [2024-07-15 00:15:43.055253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #31 NEW cov: 11819 ft: 13044 corp: 9/286b lim: 120 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 ChangeByte- 00:07:44.263 [2024-07-15 00:15:43.095353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.095380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #37 NEW cov: 11819 ft: 13082 corp: 10/319b lim: 120 exec/s: 0 rss: 67Mb L: 33/37 MS: 1 ShuffleBytes- 00:07:44.263 [2024-07-15 00:15:43.135471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.135499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #38 NEW cov: 11819 ft: 13139 corp: 11/357b lim: 120 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertByte- 00:07:44.263 [2024-07-15 00:15:43.175633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.175661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #39 NEW cov: 11819 ft: 13168 corp: 12/395b lim: 120 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertByte- 00:07:44.263 [2024-07-15 00:15:43.205717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.205745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #40 NEW cov: 11819 ft: 13216 corp: 13/429b lim: 120 exec/s: 0 rss: 68Mb L: 34/38 MS: 1 ChangeBinInt- 00:07:44.263 [2024-07-15 00:15:43.245833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.245862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #41 NEW cov: 11819 ft: 13223 corp: 14/466b lim: 120 exec/s: 0 rss: 68Mb L: 37/38 MS: 1 CrossOver- 00:07:44.263 [2024-07-15 00:15:43.285922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.285950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 #42 NEW cov: 11819 ft: 13275 corp: 15/507b lim: 120 exec/s: 0 rss: 68Mb L: 41/41 MS: 1 CopyPart- 00:07:44.263 [2024-07-15 00:15:43.316324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744027272773631 len:62966 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.316353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.263 [2024-07-15 00:15:43.316390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17723342345328784885 len:62966 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.316407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.263 [2024-07-15 00:15:43.316471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:17723353383394735605 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.263 [2024-07-15 00:15:43.316489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:44.521 #43 NEW cov: 11819 ft: 14175 corp: 16/591b lim: 120 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:07:44.521 [2024-07-15 00:15:43.366154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 [2024-07-15 00:15:43.366182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.521 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.521 #44 NEW cov: 11842 ft: 14204 corp: 17/615b lim: 120 exec/s: 0 rss: 68Mb L: 24/84 MS: 1 EraseBytes- 00:07:44.521 [2024-07-15 00:15:43.406424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 [2024-07-15 00:15:43.406457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.521 [2024-07-15 00:15:43.406514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 [2024-07-15 00:15:43.406528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.521 #50 NEW cov: 11842 ft: 14541 corp: 18/686b lim: 120 exec/s: 0 rss: 68Mb L: 71/84 MS: 1 InsertRepeatedBytes- 00:07:44.521 [2024-07-15 00:15:43.446460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 
[2024-07-15 00:15:43.446488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.521 #51 NEW cov: 11842 ft: 14569 corp: 19/720b lim: 120 exec/s: 0 rss: 68Mb L: 34/84 MS: 1 ChangeByte- 00:07:44.521 [2024-07-15 00:15:43.486536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 [2024-07-15 00:15:43.486564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.521 #52 NEW cov: 11842 ft: 14628 corp: 20/757b lim: 120 exec/s: 52 rss: 68Mb L: 37/84 MS: 1 CMP- DE: "\011\000\000\000"- 00:07:44.521 [2024-07-15 00:15:43.526619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 [2024-07-15 00:15:43.526647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.521 #53 NEW cov: 11842 ft: 14745 corp: 21/794b lim: 120 exec/s: 53 rss: 68Mb L: 37/84 MS: 1 CopyPart- 00:07:44.521 [2024-07-15 00:15:43.566755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.521 [2024-07-15 00:15:43.566783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 #54 NEW cov: 11842 ft: 14777 corp: 22/831b lim: 120 exec/s: 54 rss: 69Mb L: 37/84 MS: 1 ShuffleBytes- 00:07:44.780 [2024-07-15 00:15:43.606902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446550 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.606930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 #55 NEW cov: 11842 ft: 14789 corp: 23/869b lim: 120 exec/s: 55 rss: 69Mb L: 38/84 MS: 1 InsertByte- 00:07:44.780 [2024-07-15 00:15:43.646968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.646999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 #56 NEW cov: 11842 ft: 14795 corp: 24/906b lim: 120 exec/s: 56 rss: 69Mb L: 37/84 MS: 1 ChangeByte- 00:07:44.780 [2024-07-15 00:15:43.687078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.687104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 #62 NEW cov: 11842 ft: 14818 corp: 25/930b lim: 120 exec/s: 62 rss: 69Mb L: 24/84 MS: 1 EraseBytes- 00:07:44.780 [2024-07-15 00:15:43.717328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.717355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 [2024-07-15 00:15:43.717420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.717437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.780 #63 NEW cov: 11842 ft: 14826 corp: 26/987b lim: 120 exec/s: 63 rss: 69Mb L: 57/84 MS: 1 InsertRepeatedBytes- 00:07:44.780 [2024-07-15 00:15:43.757318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.757344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 #64 NEW cov: 11842 ft: 14845 corp: 27/1024b lim: 120 exec/s: 64 rss: 69Mb L: 37/84 MS: 1 ShuffleBytes- 00:07:44.780 [2024-07-15 00:15:43.797746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315291136 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.797774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.780 [2024-07-15 00:15:43.797814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.797830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.780 [2024-07-15 00:15:43.797887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.780 [2024-07-15 00:15:43.797903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:44.780 #65 NEW cov: 11842 ft: 14884 corp: 28/1098b lim: 120 exec/s: 65 rss: 69Mb L: 74/84 MS: 1 InsertRepeatedBytes- 00:07:45.039 [2024-07-15 00:15:43.837586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446550 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:43.837615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.039 #66 NEW cov: 11842 ft: 14905 corp: 29/1136b lim: 120 exec/s: 66 rss: 69Mb L: 38/84 MS: 1 ChangeBit- 00:07:45.039 [2024-07-15 00:15:43.877669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069585512191 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:43.877696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.039 #68 NEW cov: 11842 ft: 14909 corp: 30/1167b lim: 120 exec/s: 68 rss: 69Mb L: 31/84 MS: 2 CopyPart-CrossOver- 00:07:45.039 [2024-07-15 00:15:43.907707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070035406847 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:43.907737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:45.039 #69 NEW cov: 11842 ft: 14985 corp: 31/1204b lim: 120 exec/s: 69 rss: 69Mb L: 37/84 MS: 1 ChangeBinInt- 00:07:45.039 [2024-07-15 00:15:43.947822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:43.947850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.039 #70 NEW cov: 11842 ft: 15006 corp: 32/1241b lim: 120 exec/s: 70 rss: 69Mb L: 37/84 MS: 1 ShuffleBytes- 00:07:45.039 [2024-07-15 00:15:43.977902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:43.977928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.039 #71 NEW cov: 11842 ft: 15017 corp: 33/1265b lim: 120 exec/s: 71 rss: 69Mb L: 24/84 MS: 1 ChangeBit- 00:07:45.039 [2024-07-15 00:15:44.018343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:77268776910848 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:44.018371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.039 [2024-07-15 00:15:44.018408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:44.018424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.039 [2024-07-15 00:15:44.018502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:44.018518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.039 #75 NEW cov: 11842 ft: 15096 corp: 34/1348b lim: 120 exec/s: 75 rss: 69Mb L: 83/84 MS: 4 EraseBytes-EraseBytes-CrossOver-InsertRepeatedBytes- 00:07:45.039 [2024-07-15 00:15:44.058481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:77268776910848 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:44.058508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.039 [2024-07-15 00:15:44.058566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5063812098665367110 len:47431 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:44.058582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.039 [2024-07-15 00:15:44.058640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.039 [2024-07-15 00:15:44.058654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.039 #76 NEW cov: 11842 ft: 15155 corp: 35/1431b lim: 120 exec/s: 76 rss: 70Mb L: 
83/84 MS: 1 ChangeBinInt- 00:07:45.298 [2024-07-15 00:15:44.098299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446463694757363711 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.298 [2024-07-15 00:15:44.098326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 #77 NEW cov: 11842 ft: 15215 corp: 36/1467b lim: 120 exec/s: 77 rss: 70Mb L: 36/84 MS: 1 ChangeByte- 00:07:45.299 [2024-07-15 00:15:44.138402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.138433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 #78 NEW cov: 11842 ft: 15225 corp: 37/1504b lim: 120 exec/s: 78 rss: 70Mb L: 37/84 MS: 1 ChangeBit- 00:07:45.299 [2024-07-15 00:15:44.178483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.178510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 #79 NEW cov: 11842 ft: 15239 corp: 38/1541b lim: 120 exec/s: 79 rss: 70Mb L: 37/84 MS: 1 ChangeBit- 00:07:45.299 [2024-07-15 00:15:44.218768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070035406847 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.218795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 [2024-07-15 00:15:44.218841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.218856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.299 #80 NEW cov: 11842 ft: 15268 corp: 39/1611b lim: 120 exec/s: 80 rss: 70Mb L: 70/84 MS: 1 InsertRepeatedBytes- 00:07:45.299 [2024-07-15 00:15:44.258907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070035406847 len:65444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.258935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 [2024-07-15 00:15:44.258978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:12327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.258994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.299 #81 NEW cov: 11842 ft: 15269 corp: 40/1664b lim: 120 exec/s: 81 rss: 70Mb L: 53/84 MS: 1 CrossOver- 00:07:45.299 [2024-07-15 00:15:44.298901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315256056 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.298928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 #82 NEW cov: 11842 ft: 15278 corp: 41/1688b lim: 120 exec/s: 82 rss: 70Mb L: 24/84 MS: 1 ChangeBinInt- 00:07:45.299 [2024-07-15 00:15:44.329478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.329506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.299 [2024-07-15 00:15:44.329558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.329574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.299 [2024-07-15 00:15:44.329631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.329647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.299 [2024-07-15 00:15:44.329703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.299 [2024-07-15 00:15:44.329719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.559 #83 NEW cov: 11842 ft: 15720 corp: 42/1797b lim: 120 exec/s: 83 rss: 70Mb L: 109/109 MS: 1 InsertRepeatedBytes- 00:07:45.559 [2024-07-15 00:15:44.379096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222184447 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.379125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.559 #84 NEW cov: 11842 ft: 15754 corp: 43/1834b lim: 120 exec/s: 84 rss: 70Mb L: 37/109 MS: 1 ChangeBit- 00:07:45.559 [2024-07-15 00:15:44.409188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070222446591 len:41984 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.409216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.559 #85 NEW cov: 11842 ft: 15800 corp: 44/1871b lim: 120 exec/s: 85 rss: 70Mb L: 37/109 MS: 1 ChangeBinInt- 00:07:45.559 [2024-07-15 00:15:44.449663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:77268776910848 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.449690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.559 [2024-07-15 00:15:44.449745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.449761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.559 [2024-07-15 00:15:44.449823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 
lba:5063812098665367110 len:17991 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.449839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.559 #86 NEW cov: 11842 ft: 15839 corp: 45/1958b lim: 120 exec/s: 86 rss: 70Mb L: 87/109 MS: 1 PersAutoDict- DE: "\011\000\000\000"- 00:07:45.559 [2024-07-15 00:15:44.489944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.489971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.559 [2024-07-15 00:15:44.490009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.490026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.559 [2024-07-15 00:15:44.490082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.490097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.559 [2024-07-15 00:15:44.490154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446743008557662208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.559 [2024-07-15 00:15:44.490168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.559 #87 NEW cov: 11842 ft: 15843 corp: 46/2071b lim: 120 exec/s: 43 rss: 70Mb L: 113/113 MS: 1 CMP- DE: "\377\377\377\010"- 00:07:45.559 #87 DONE cov: 11842 ft: 15843 corp: 46/2071b lim: 120 exec/s: 43 rss: 70Mb 00:07:45.559 ###### Recommended dictionary. ###### 00:07:45.559 "\011\000\000\000" # Uses: 1 00:07:45.559 "\377\377\377\010" # Uses: 0 00:07:45.559 ###### End of recommended dictionary. 
###### 00:07:45.559 Done 87 runs in 2 second(s) 00:07:45.819 00:15:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:07:45.819 00:15:44 -- ../common.sh@72 -- # (( i++ )) 00:07:45.819 00:15:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.819 00:15:44 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:45.819 00:15:44 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:45.819 00:15:44 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.819 00:15:44 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.819 00:15:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:45.819 00:15:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:45.819 00:15:44 -- nvmf/run.sh@29 -- # printf %02d 18 00:07:45.819 00:15:44 -- nvmf/run.sh@29 -- # port=4418 00:07:45.819 00:15:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:45.819 00:15:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:45.819 00:15:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.819 00:15:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:07:45.819 [2024-07-15 00:15:44.684016] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:45.819 [2024-07-15 00:15:44.684108] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid334253 ] 00:07:45.819 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.819 [2024-07-15 00:15:44.866796] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.078 [2024-07-15 00:15:44.930869] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.078 [2024-07-15 00:15:44.931014] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.078 [2024-07-15 00:15:44.989180] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.078 [2024-07-15 00:15:45.005483] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:46.078 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.078 INFO: Seed: 3894625877 00:07:46.078 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:46.078 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:46.078 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:46.078 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.078 #2 INITED exec/s: 0 rss: 60Mb 00:07:46.078 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:46.078 This may also happen if the target rejected all inputs we tried so far 00:07:46.078 [2024-07-15 00:15:45.060626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.078 [2024-07-15 00:15:45.060655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.338 NEW_FUNC[1/670]: 0x49e290 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:46.338 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.338 #42 NEW cov: 11559 ft: 11560 corp: 2/30b lim: 100 exec/s: 0 rss: 67Mb L: 29/29 MS: 5 ChangeBit-CopyPart-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:46.338 [2024-07-15 00:15:45.391422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.338 [2024-07-15 00:15:45.391463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.597 #43 NEW cov: 11672 ft: 12066 corp: 3/59b lim: 100 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeBit- 00:07:46.597 [2024-07-15 00:15:45.431510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.597 [2024-07-15 00:15:45.431540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.597 #44 NEW cov: 11678 ft: 12445 corp: 4/89b lim: 100 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 CopyPart- 00:07:46.597 [2024-07-15 00:15:45.471851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.597 [2024-07-15 00:15:45.471877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.597 [2024-07-15 00:15:45.471929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.597 [2024-07-15 00:15:45.471943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.597 [2024-07-15 00:15:45.471998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:46.597 [2024-07-15 00:15:45.472013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.597 #45 NEW cov: 11763 ft: 13046 corp: 5/154b lim: 100 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:07:46.598 [2024-07-15 00:15:45.511916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.598 [2024-07-15 00:15:45.511943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.511981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.598 [2024-07-15 00:15:45.511997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.512050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 
nsid:0 00:07:46.598 [2024-07-15 00:15:45.512065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.598 #46 NEW cov: 11763 ft: 13148 corp: 6/216b lim: 100 exec/s: 0 rss: 67Mb L: 62/65 MS: 1 InsertRepeatedBytes- 00:07:46.598 [2024-07-15 00:15:45.552066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.598 [2024-07-15 00:15:45.552092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.552129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.598 [2024-07-15 00:15:45.552144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.552199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:46.598 [2024-07-15 00:15:45.552215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.598 #47 NEW cov: 11763 ft: 13180 corp: 7/284b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:07:46.598 [2024-07-15 00:15:45.592079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.598 [2024-07-15 00:15:45.592105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.592157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.598 [2024-07-15 00:15:45.592172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.598 #48 NEW cov: 11763 ft: 13450 corp: 8/330b lim: 100 exec/s: 0 rss: 67Mb L: 46/68 MS: 1 InsertRepeatedBytes- 00:07:46.598 [2024-07-15 00:15:45.632291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.598 [2024-07-15 00:15:45.632321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.632372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.598 [2024-07-15 00:15:45.632386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.598 [2024-07-15 00:15:45.632445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:46.598 [2024-07-15 00:15:45.632461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.598 #49 NEW cov: 11763 ft: 13497 corp: 9/392b lim: 100 exec/s: 0 rss: 67Mb L: 62/68 MS: 1 CopyPart- 00:07:46.857 [2024-07-15 00:15:45.672256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.857 [2024-07-15 00:15:45.672283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.857 [2024-07-15 00:15:45.672335] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.857 [2024-07-15 00:15:45.672351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.857 #50 NEW cov: 11763 ft: 13568 corp: 10/439b lim: 100 exec/s: 0 rss: 68Mb L: 47/68 MS: 1 InsertByte- 00:07:46.857 [2024-07-15 00:15:45.712415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.857 [2024-07-15 00:15:45.712440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.857 [2024-07-15 00:15:45.712510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.857 [2024-07-15 00:15:45.712524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.858 #51 NEW cov: 11763 ft: 13607 corp: 11/485b lim: 100 exec/s: 0 rss: 68Mb L: 46/68 MS: 1 CrossOver- 00:07:46.858 [2024-07-15 00:15:45.752376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.858 [2024-07-15 00:15:45.752401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.858 #52 NEW cov: 11763 ft: 13675 corp: 12/514b lim: 100 exec/s: 0 rss: 68Mb L: 29/68 MS: 1 ShuffleBytes- 00:07:46.858 [2024-07-15 00:15:45.792538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.858 [2024-07-15 00:15:45.792563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.858 #53 NEW cov: 11763 ft: 13723 corp: 13/543b lim: 100 exec/s: 0 rss: 68Mb L: 29/68 MS: 1 ChangeByte- 00:07:46.858 [2024-07-15 00:15:45.832657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.858 [2024-07-15 00:15:45.832681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.858 #54 NEW cov: 11763 ft: 13761 corp: 14/570b lim: 100 exec/s: 0 rss: 68Mb L: 27/68 MS: 1 EraseBytes- 00:07:46.858 [2024-07-15 00:15:45.872906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.858 [2024-07-15 00:15:45.872931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.858 [2024-07-15 00:15:45.872998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.858 [2024-07-15 00:15:45.873012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.858 #55 NEW cov: 11763 ft: 13779 corp: 15/624b lim: 100 exec/s: 0 rss: 68Mb L: 54/68 MS: 1 InsertRepeatedBytes- 00:07:46.858 [2024-07-15 00:15:45.913289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:46.858 [2024-07-15 00:15:45.913315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.858 [2024-07-15 00:15:45.913381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:46.858 [2024-07-15 00:15:45.913397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.858 [2024-07-15 00:15:45.913458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:46.858 [2024-07-15 00:15:45.913472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.858 [2024-07-15 00:15:45.913525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:46.858 [2024-07-15 00:15:45.913540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.117 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.117 #56 NEW cov: 11786 ft: 14080 corp: 16/723b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 CrossOver- 00:07:47.117 [2024-07-15 00:15:45.963432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.117 [2024-07-15 00:15:45.963461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.117 [2024-07-15 00:15:45.963530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.117 [2024-07-15 00:15:45.963545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.117 [2024-07-15 00:15:45.963596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.117 [2024-07-15 00:15:45.963611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.117 [2024-07-15 00:15:45.963661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.118 [2024-07-15 00:15:45.963677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.118 #57 NEW cov: 11786 ft: 14101 corp: 17/805b lim: 100 exec/s: 0 rss: 68Mb L: 82/99 MS: 1 CopyPart- 00:07:47.118 [2024-07-15 00:15:46.003114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.118 [2024-07-15 00:15:46.003140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.118 #58 NEW cov: 11786 ft: 14136 corp: 18/832b lim: 100 exec/s: 0 rss: 68Mb L: 27/99 MS: 1 ChangeByte- 00:07:47.118 [2024-07-15 00:15:46.043635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.118 [2024-07-15 00:15:46.043663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.043711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.118 [2024-07-15 00:15:46.043728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.043785] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.118 [2024-07-15 00:15:46.043801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.043859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.118 [2024-07-15 00:15:46.043873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.118 #59 NEW cov: 11786 ft: 14189 corp: 19/925b lim: 100 exec/s: 59 rss: 68Mb L: 93/99 MS: 1 InsertRepeatedBytes- 00:07:47.118 [2024-07-15 00:15:46.083591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.118 [2024-07-15 00:15:46.083618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.083657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.118 [2024-07-15 00:15:46.083673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.083730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.118 [2024-07-15 00:15:46.083744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.118 #60 NEW cov: 11786 ft: 14246 corp: 20/1004b lim: 100 exec/s: 60 rss: 69Mb L: 79/99 MS: 1 InsertRepeatedBytes- 00:07:47.118 [2024-07-15 00:15:46.123455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.118 [2024-07-15 00:15:46.123482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.118 #61 NEW cov: 11786 ft: 14273 corp: 21/1031b lim: 100 exec/s: 61 rss: 69Mb L: 27/99 MS: 1 ChangeBinInt- 00:07:47.118 [2024-07-15 00:15:46.163916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.118 [2024-07-15 00:15:46.163942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.163981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.118 [2024-07-15 00:15:46.163996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.164051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.118 [2024-07-15 00:15:46.164065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.118 [2024-07-15 00:15:46.164120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.118 [2024-07-15 00:15:46.164136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.377 #62 NEW cov: 11786 ft: 14311 corp: 22/1129b lim: 100 exec/s: 62 rss: 69Mb L: 98/99 
MS: 1 InsertRepeatedBytes- 00:07:47.377 [2024-07-15 00:15:46.204082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.377 [2024-07-15 00:15:46.204109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.204159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.377 [2024-07-15 00:15:46.204173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.204228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.377 [2024-07-15 00:15:46.204243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.204301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.377 [2024-07-15 00:15:46.204315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.234175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.377 [2024-07-15 00:15:46.234203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.234241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.377 [2024-07-15 00:15:46.234255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.234310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.377 [2024-07-15 00:15:46.234341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.234398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.377 [2024-07-15 00:15:46.234412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.377 #64 NEW cov: 11786 ft: 14321 corp: 23/1227b lim: 100 exec/s: 64 rss: 69Mb L: 98/99 MS: 2 InsertRepeatedBytes-CMP- DE: "\177\000"- 00:07:47.377 [2024-07-15 00:15:46.273907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.377 [2024-07-15 00:15:46.273932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.377 #65 NEW cov: 11786 ft: 14341 corp: 24/1254b lim: 100 exec/s: 65 rss: 69Mb L: 27/99 MS: 1 ChangeBinInt- 00:07:47.377 [2024-07-15 00:15:46.314048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.377 [2024-07-15 00:15:46.314076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.377 #66 NEW cov: 11786 ft: 14435 corp: 25/1286b lim: 100 exec/s: 66 rss: 69Mb L: 32/99 MS: 1 CopyPart- 00:07:47.377 
[2024-07-15 00:15:46.354582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.377 [2024-07-15 00:15:46.354609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.354665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.377 [2024-07-15 00:15:46.354679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.354731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.377 [2024-07-15 00:15:46.354746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.377 [2024-07-15 00:15:46.354800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.377 [2024-07-15 00:15:46.354815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.378 [2024-07-15 00:15:46.354871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:47.378 [2024-07-15 00:15:46.354885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:47.378 #67 NEW cov: 11786 ft: 14465 corp: 26/1386b lim: 100 exec/s: 67 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:07:47.378 [2024-07-15 00:15:46.394248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.378 [2024-07-15 00:15:46.394275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.378 #68 NEW cov: 11786 ft: 14486 corp: 27/1414b lim: 100 exec/s: 68 rss: 69Mb L: 28/100 MS: 1 InsertByte- 00:07:47.378 [2024-07-15 00:15:46.424604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.378 [2024-07-15 00:15:46.424631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.378 [2024-07-15 00:15:46.424684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.378 [2024-07-15 00:15:46.424699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.378 [2024-07-15 00:15:46.424757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.378 [2024-07-15 00:15:46.424772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.637 #69 NEW cov: 11786 ft: 14541 corp: 28/1482b lim: 100 exec/s: 69 rss: 69Mb L: 68/100 MS: 1 CopyPart- 00:07:47.637 [2024-07-15 00:15:46.464821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.637 [2024-07-15 00:15:46.464848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.637 [2024-07-15 00:15:46.464886] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.637 [2024-07-15 00:15:46.464901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.637 [2024-07-15 00:15:46.464956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.637 [2024-07-15 00:15:46.464971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.637 [2024-07-15 00:15:46.465028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.637 [2024-07-15 00:15:46.465043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.637 #70 NEW cov: 11786 ft: 14545 corp: 29/1565b lim: 100 exec/s: 70 rss: 69Mb L: 83/100 MS: 1 InsertByte- 00:07:47.637 [2024-07-15 00:15:46.504568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.637 [2024-07-15 00:15:46.504596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.637 #76 NEW cov: 11786 ft: 14586 corp: 30/1592b lim: 100 exec/s: 76 rss: 69Mb L: 27/100 MS: 1 ChangeBit- 00:07:47.637 [2024-07-15 00:15:46.544708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.637 [2024-07-15 00:15:46.544736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.637 #77 NEW cov: 11786 ft: 14608 corp: 31/1619b lim: 100 exec/s: 77 rss: 69Mb L: 27/100 MS: 1 CopyPart- 00:07:47.637 [2024-07-15 00:15:46.575093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.637 [2024-07-15 00:15:46.575119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.637 [2024-07-15 00:15:46.575163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.637 [2024-07-15 00:15:46.575177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.637 [2024-07-15 00:15:46.575232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.637 [2024-07-15 00:15:46.575246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.637 #78 NEW cov: 11786 ft: 14618 corp: 32/1679b lim: 100 exec/s: 78 rss: 70Mb L: 60/100 MS: 1 CrossOver- 00:07:47.637 [2024-07-15 00:15:46.614941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.637 [2024-07-15 00:15:46.614967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.637 #79 NEW cov: 11786 ft: 14632 corp: 33/1711b lim: 100 exec/s: 79 rss: 70Mb L: 32/100 MS: 1 ChangeBinInt- 00:07:47.637 [2024-07-15 00:15:46.655031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.637 [2024-07-15 00:15:46.655057] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.637 #80 NEW cov: 11786 ft: 14646 corp: 34/1743b lim: 100 exec/s: 80 rss: 70Mb L: 32/100 MS: 1 ChangeBit- 00:07:47.896 [2024-07-15 00:15:46.695227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.695255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 #81 NEW cov: 11786 ft: 14672 corp: 35/1775b lim: 100 exec/s: 81 rss: 70Mb L: 32/100 MS: 1 ChangeByte- 00:07:47.896 [2024-07-15 00:15:46.725477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.725503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.725553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.896 [2024-07-15 00:15:46.725566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.725624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.896 [2024-07-15 00:15:46.725640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.896 #82 NEW cov: 11786 ft: 14709 corp: 36/1837b lim: 100 exec/s: 82 rss: 70Mb L: 62/100 MS: 1 ChangeBit- 00:07:47.896 [2024-07-15 00:15:46.765391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.765417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 #83 NEW cov: 11786 ft: 14712 corp: 37/1866b lim: 100 exec/s: 83 rss: 70Mb L: 29/100 MS: 1 PersAutoDict- DE: "\177\000"- 00:07:47.896 [2024-07-15 00:15:46.795808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.795834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.795884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.896 [2024-07-15 00:15:46.795898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.795955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.896 [2024-07-15 00:15:46.795969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.796024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.896 [2024-07-15 00:15:46.796038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.896 #84 NEW cov: 11786 ft: 14718 corp: 38/1965b lim: 100 exec/s: 84 rss: 70Mb L: 99/100 MS: 1 InsertByte- 00:07:47.896 
[2024-07-15 00:15:46.835863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.835889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.835934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.896 [2024-07-15 00:15:46.835949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.836012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.896 [2024-07-15 00:15:46.836025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.896 #85 NEW cov: 11786 ft: 14724 corp: 39/2037b lim: 100 exec/s: 85 rss: 70Mb L: 72/100 MS: 1 InsertRepeatedBytes- 00:07:47.896 [2024-07-15 00:15:46.865693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.865718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 #86 NEW cov: 11786 ft: 14758 corp: 40/2069b lim: 100 exec/s: 86 rss: 70Mb L: 32/100 MS: 1 ChangeBit- 00:07:47.896 [2024-07-15 00:15:46.906020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.906045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.906082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.896 [2024-07-15 00:15:46.906097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.906153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.896 [2024-07-15 00:15:46.906166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.896 #87 NEW cov: 11786 ft: 14770 corp: 41/2131b lim: 100 exec/s: 87 rss: 70Mb L: 62/100 MS: 1 ChangeBinInt- 00:07:47.896 [2024-07-15 00:15:46.946251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:47.896 [2024-07-15 00:15:46.946277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.946325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:47.896 [2024-07-15 00:15:46.946339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.946392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:47.896 [2024-07-15 00:15:46.946406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.896 [2024-07-15 00:15:46.946463] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:47.896 [2024-07-15 00:15:46.946478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.155 #93 NEW cov: 11786 ft: 14799 corp: 42/2229b lim: 100 exec/s: 93 rss: 70Mb L: 98/100 MS: 1 ShuffleBytes- 00:07:48.155 [2024-07-15 00:15:46.986045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:48.155 [2024-07-15 00:15:46.986070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.155 #94 NEW cov: 11786 ft: 14890 corp: 43/2257b lim: 100 exec/s: 94 rss: 70Mb L: 28/100 MS: 1 InsertByte- 00:07:48.155 [2024-07-15 00:15:47.026277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:48.155 [2024-07-15 00:15:47.026303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.155 [2024-07-15 00:15:47.026340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:48.155 [2024-07-15 00:15:47.026354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.155 #95 NEW cov: 11786 ft: 14899 corp: 44/2311b lim: 100 exec/s: 47 rss: 70Mb L: 54/100 MS: 1 ChangeByte- 00:07:48.155 #95 DONE cov: 11786 ft: 14899 corp: 44/2311b lim: 100 exec/s: 47 rss: 70Mb 00:07:48.155 ###### Recommended dictionary. ###### 00:07:48.155 "\177\000" # Uses: 6 00:07:48.155 ###### End of recommended dictionary. ###### 00:07:48.155 Done 95 runs in 2 second(s) 00:07:48.155 00:15:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:07:48.155 00:15:47 -- ../common.sh@72 -- # (( i++ )) 00:07:48.155 00:15:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.155 00:15:47 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:48.155 00:15:47 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:48.155 00:15:47 -- nvmf/run.sh@24 -- # local timen=1 00:07:48.155 00:15:47 -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.155 00:15:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:48.155 00:15:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:48.155 00:15:47 -- nvmf/run.sh@29 -- # printf %02d 19 00:07:48.155 00:15:47 -- nvmf/run.sh@29 -- # port=4419 00:07:48.155 00:15:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:48.155 00:15:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:48.155 00:15:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.155 00:15:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:07:48.413 [2024-07-15 00:15:47.212080] 
Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:48.413 [2024-07-15 00:15:47.212174] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid334714 ] 00:07:48.413 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.413 [2024-07-15 00:15:47.388505] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.413 [2024-07-15 00:15:47.452449] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.413 [2024-07-15 00:15:47.452598] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.672 [2024-07-15 00:15:47.510656] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.672 [2024-07-15 00:15:47.526949] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:48.672 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.672 INFO: Seed: 2120654711 00:07:48.672 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:48.672 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:48.672 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:48.672 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.672 #2 INITED exec/s: 0 rss: 60Mb 00:07:48.672 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:48.672 This may also happen if the target rejected all inputs we tried so far 00:07:48.672 [2024-07-15 00:15:47.603173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:48.672 [2024-07-15 00:15:47.603208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.672 [2024-07-15 00:15:47.603330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:48.672 [2024-07-15 00:15:47.603354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.672 [2024-07-15 00:15:47.603478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:48.672 [2024-07-15 00:15:47.603502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.930 NEW_FUNC[1/669]: 0x4a1250 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:48.930 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.930 #15 NEW cov: 11519 ft: 11538 corp: 2/40b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:48.930 [2024-07-15 00:15:47.944030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:48.930 [2024-07-15 00:15:47.944089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.930 [2024-07-15 00:15:47.944192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:48.930 [2024-07-15 00:15:47.944221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.930 [2024-07-15 00:15:47.944348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742978492891135 len:1 00:07:48.930 [2024-07-15 00:15:47.944376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.930 NEW_FUNC[1/1]: 0x1cae0f0 in thread_execute_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:932 00:07:48.930 #16 NEW cov: 11650 ft: 12221 corp: 3/79b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:49.189 [2024-07-15 00:15:47.993570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:47.993606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:47.993728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:47.993745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:47.993865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742978492891135 len:1 00:07:49.189 [2024-07-15 00:15:47.993888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.189 #17 NEW cov: 11656 ft: 12532 corp: 4/118b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:49.189 [2024-07-15 00:15:48.034074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.034106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.034194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.034215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.034326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743021442564095 len:65536 00:07:49.189 [2024-07-15 00:15:48.034347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.189 #18 NEW cov: 11741 ft: 12826 corp: 5/148b lim: 50 exec/s: 0 rss: 67Mb L: 30/39 MS: 1 CrossOver- 00:07:49.189 [2024-07-15 00:15:48.074178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.074211] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.074332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.074352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.074465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709488895 len:2816 00:07:49.189 [2024-07-15 00:15:48.074486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.189 #19 NEW cov: 11741 ft: 12896 corp: 6/179b lim: 50 exec/s: 0 rss: 67Mb L: 31/39 MS: 1 CrossOver- 00:07:49.189 [2024-07-15 00:15:48.114344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.114375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.114455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.114478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.114594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742978492891135 len:1 00:07:49.189 [2024-07-15 00:15:48.114616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.189 #20 NEW cov: 11741 ft: 12963 corp: 7/218b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:49.189 [2024-07-15 00:15:48.154527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18391293503297552383 len:65536 00:07:49.189 [2024-07-15 00:15:48.154558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.154643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.154666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.154783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709488895 len:2816 00:07:49.189 [2024-07-15 00:15:48.154802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.189 #21 NEW cov: 11741 ft: 13018 corp: 8/249b lim: 50 exec/s: 0 rss: 68Mb L: 31/39 MS: 1 ChangeByte- 00:07:49.189 [2024-07-15 00:15:48.194664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.194693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.194798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.194825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.189 [2024-07-15 00:15:48.194939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:49.189 [2024-07-15 00:15:48.194962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.189 #22 NEW cov: 11741 ft: 13047 corp: 9/288b lim: 50 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:49.189 [2024-07-15 00:15:48.234411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14793702183455643 len:39681 00:07:49.189 [2024-07-15 00:15:48.234435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 #27 NEW cov: 11741 ft: 13422 corp: 10/299b lim: 50 exec/s: 0 rss: 68Mb L: 11/39 MS: 5 ShuffleBytes-CMP-CopyPart-EraseBytes-CopyPart- DE: "3\2474\216\313\233*\000"- 00:07:49.447 [2024-07-15 00:15:48.274544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14793702200232859 len:39681 00:07:49.447 [2024-07-15 00:15:48.274572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 #28 NEW cov: 11741 ft: 13465 corp: 11/310b lim: 50 exec/s: 0 rss: 68Mb L: 11/39 MS: 1 ChangeASCIIInt- 00:07:49.447 [2024-07-15 00:15:48.314677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3787188205641243392 len:39681 00:07:49.447 [2024-07-15 00:15:48.314701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 #29 NEW cov: 11741 ft: 13521 corp: 12/321b lim: 50 exec/s: 0 rss: 68Mb L: 11/39 MS: 1 CopyPart- 00:07:49.447 [2024-07-15 00:15:48.355099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:792633534417207295 len:65536 00:07:49.447 [2024-07-15 00:15:48.355130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.355219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069599133695 len:65536 00:07:49.447 [2024-07-15 00:15:48.355241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.355346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.355367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.447 #30 NEW cov: 11741 ft: 13551 corp: 13/360b lim: 50 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CrossOver- 00:07:49.447 [2024-07-15 00:15:48.395292] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.395320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.395430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.395454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.395571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551360 len:1 00:07:49.447 [2024-07-15 00:15:48.395590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.447 #31 NEW cov: 11741 ft: 13640 corp: 14/399b lim: 50 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:49.447 [2024-07-15 00:15:48.435291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.435323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.435405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073695985663 len:65536 00:07:49.447 [2024-07-15 00:15:48.435422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.435542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.435560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.447 #32 NEW cov: 11741 ft: 13670 corp: 15/438b lim: 50 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 ChangeByte- 00:07:49.447 [2024-07-15 00:15:48.475461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.475490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.475547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.447 [2024-07-15 00:15:48.475570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.447 [2024-07-15 00:15:48.475685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742978492891135 len:1 00:07:49.447 [2024-07-15 00:15:48.475708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.447 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.447 #33 NEW cov: 11764 ft: 13723 corp: 16/477b lim: 50 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 
ChangeByte- 00:07:49.731 [2024-07-15 00:15:48.515581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073698934783 len:65536 00:07:49.731 [2024-07-15 00:15:48.515612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.515689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.515711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.515821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743021442564095 len:65536 00:07:49.731 [2024-07-15 00:15:48.515841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.731 #34 NEW cov: 11764 ft: 13735 corp: 17/507b lim: 50 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 ChangeByte- 00:07:49.731 [2024-07-15 00:15:48.555865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3787188207540450215 len:10753 00:07:49.731 [2024-07-15 00:15:48.555896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.556015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.556038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.556150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.556173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.556288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446462603011096575 len:2304 00:07:49.731 [2024-07-15 00:15:48.556307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.731 #35 NEW cov: 11764 ft: 13972 corp: 18/554b lim: 50 exec/s: 35 rss: 68Mb L: 47/47 MS: 1 PersAutoDict- DE: "3\2474\216\313\233*\000"- 00:07:49.731 [2024-07-15 00:15:48.605899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744039339196415 len:65536 00:07:49.731 [2024-07-15 00:15:48.605929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.606042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.606063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.606180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:2 nsid:0 lba:18446743021442564095 len:65536 00:07:49.731 [2024-07-15 00:15:48.606202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.731 #36 NEW cov: 11764 ft: 13992 corp: 19/584b lim: 50 exec/s: 36 rss: 69Mb L: 30/47 MS: 1 ChangeBit- 00:07:49.731 [2024-07-15 00:15:48.645863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073698934783 len:65536 00:07:49.731 [2024-07-15 00:15:48.645893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.645990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.646013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.646130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743021442564095 len:65536 00:07:49.731 [2024-07-15 00:15:48.646152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.731 #37 NEW cov: 11764 ft: 14002 corp: 20/614b lim: 50 exec/s: 37 rss: 69Mb L: 30/47 MS: 1 CopyPart- 00:07:49.731 [2024-07-15 00:15:48.685816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:24064 00:07:49.731 [2024-07-15 00:15:48.685848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.685956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.685980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.731 #38 NEW cov: 11764 ft: 14224 corp: 21/636b lim: 50 exec/s: 38 rss: 69Mb L: 22/47 MS: 1 CrossOver- 00:07:49.731 [2024-07-15 00:15:48.725838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3271557120 len:1 00:07:49.731 [2024-07-15 00:15:48.725862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 #41 NEW cov: 11764 ft: 14251 corp: 22/650b lim: 50 exec/s: 41 rss: 69Mb L: 14/47 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:49.731 [2024-07-15 00:15:48.766245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.766276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.766379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.731 [2024-07-15 00:15:48.766401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.731 [2024-07-15 00:15:48.766524] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743201831190527 len:36556 00:07:49.731 [2024-07-15 00:15:48.766542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.731 #42 NEW cov: 11764 ft: 14277 corp: 23/689b lim: 50 exec/s: 42 rss: 69Mb L: 39/47 MS: 1 CrossOver- 00:07:49.990 [2024-07-15 00:15:48.806079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3787188203430032295 len:10753 00:07:49.990 [2024-07-15 00:15:48.806104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.990 #47 NEW cov: 11764 ft: 14296 corp: 24/699b lim: 50 exec/s: 47 rss: 69Mb L: 10/47 MS: 5 CrossOver-EraseBytes-ShuffleBytes-CopyPart-PersAutoDict- DE: "3\2474\216\313\233*\000"- 00:07:49.990 [2024-07-15 00:15:48.846144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709510143 len:65536 00:07:49.990 [2024-07-15 00:15:48.846178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.990 #48 NEW cov: 11764 ft: 14308 corp: 25/715b lim: 50 exec/s: 48 rss: 69Mb L: 16/47 MS: 1 EraseBytes- 00:07:49.990 [2024-07-15 00:15:48.886621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709510911 len:65536 00:07:49.990 [2024-07-15 00:15:48.886651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:48.886727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.990 [2024-07-15 00:15:48.886744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:48.886853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:49.990 [2024-07-15 00:15:48.886874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.990 #49 NEW cov: 11764 ft: 14381 corp: 26/754b lim: 50 exec/s: 49 rss: 69Mb L: 39/47 MS: 1 ChangeByte- 00:07:49.990 [2024-07-15 00:15:48.926794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073698934783 len:65536 00:07:49.990 [2024-07-15 00:15:48.926828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:48.926913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551501 len:65536 00:07:49.990 [2024-07-15 00:15:48.926936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:48.927059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:2816 00:07:49.990 [2024-07-15 00:15:48.927077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.990 #50 NEW cov: 11764 ft: 14413 corp: 27/785b lim: 50 exec/s: 50 rss: 69Mb L: 31/47 MS: 1 InsertByte- 00:07:49.990 [2024-07-15 00:15:48.966764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:49.990 [2024-07-15 00:15:48.966796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:48.966902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:49.990 [2024-07-15 00:15:48.966925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:48.967044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742982787858431 len:1 00:07:49.990 [2024-07-15 00:15:48.967063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.990 #51 NEW cov: 11764 ft: 14444 corp: 28/824b lim: 50 exec/s: 51 rss: 69Mb L: 39/47 MS: 1 ChangeBinInt- 00:07:49.990 [2024-07-15 00:15:49.006700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18391293503297552383 len:65536 00:07:49.990 [2024-07-15 00:15:49.006731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:49.006833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:72057598332895231 len:65536 00:07:49.990 [2024-07-15 00:15:49.006859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.990 [2024-07-15 00:15:49.006972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709488895 len:2816 00:07:49.990 [2024-07-15 00:15:49.006991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.990 #52 NEW cov: 11764 ft: 14471 corp: 29/855b lim: 50 exec/s: 52 rss: 69Mb L: 31/47 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:50.248 [2024-07-15 00:15:49.047303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:50.248 [2024-07-15 00:15:49.047335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.248 [2024-07-15 00:15:49.047431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.248 [2024-07-15 00:15:49.047455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.248 [2024-07-15 00:15:49.047570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742978492891135 len:1 00:07:50.248 [2024-07-15 00:15:49.047592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:50.248 [2024-07-15 00:15:49.047707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069565579263 len:65291 00:07:50.248 [2024-07-15 00:15:49.047726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.248 #53 NEW cov: 11764 ft: 14536 corp: 30/895b lim: 50 exec/s: 53 rss: 69Mb L: 40/47 MS: 1 CrossOver- 00:07:50.248 [2024-07-15 00:15:49.087397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:50.248 [2024-07-15 00:15:49.087430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.248 [2024-07-15 00:15:49.087507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551401 len:65536 00:07:50.248 [2024-07-15 00:15:49.087532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.248 [2024-07-15 00:15:49.087647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:72057594037927935 len:65281 00:07:50.248 [2024-07-15 00:15:49.087672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.248 [2024-07-15 00:15:49.087795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069415174143 len:65291 00:07:50.248 [2024-07-15 00:15:49.087818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.248 #54 NEW cov: 11764 ft: 14551 corp: 31/935b lim: 50 exec/s: 54 rss: 69Mb L: 40/47 MS: 1 InsertByte- 00:07:50.248 [2024-07-15 00:15:49.127416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744039339196159 len:65536 00:07:50.248 [2024-07-15 00:15:49.127449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.127561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.127582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.127699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743021442564095 len:65536 00:07:50.249 [2024-07-15 00:15:49.127725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.249 #55 NEW cov: 11764 ft: 14561 corp: 32/965b lim: 50 exec/s: 55 rss: 69Mb L: 30/47 MS: 1 ChangeBit- 00:07:50.249 [2024-07-15 00:15:49.167471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.167503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 
00:15:49.167578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073695199231 len:65536 00:07:50.249 [2024-07-15 00:15:49.167601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.167709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.167731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.249 #56 NEW cov: 11764 ft: 14568 corp: 33/1004b lim: 50 exec/s: 56 rss: 69Mb L: 39/47 MS: 1 ChangeByte- 00:07:50.249 [2024-07-15 00:15:49.207820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709510911 len:65536 00:07:50.249 [2024-07-15 00:15:49.207852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.207918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.207940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.208041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.208062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.208169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.208191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.249 #57 NEW cov: 11764 ft: 14602 corp: 34/1047b lim: 50 exec/s: 57 rss: 69Mb L: 43/47 MS: 1 InsertRepeatedBytes- 00:07:50.249 [2024-07-15 00:15:49.247668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463702539436031 len:1 00:07:50.249 [2024-07-15 00:15:49.247704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.247807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.247827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.247944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446474693360746495 len:65536 00:07:50.249 [2024-07-15 00:15:49.247967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.249 #58 NEW cov: 11764 ft: 14611 corp: 35/1082b lim: 50 exec/s: 58 rss: 69Mb L: 35/47 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:50.249 [2024-07-15 00:15:49.287652] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.287687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.249 [2024-07-15 00:15:49.287820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.249 [2024-07-15 00:15:49.287841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.507 #59 NEW cov: 11764 ft: 14626 corp: 36/1111b lim: 50 exec/s: 59 rss: 69Mb L: 29/47 MS: 1 EraseBytes- 00:07:50.507 [2024-07-15 00:15:49.327636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3787121340063652763 len:1 00:07:50.507 [2024-07-15 00:15:49.327661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.507 #60 NEW cov: 11764 ft: 14657 corp: 37/1122b lim: 50 exec/s: 60 rss: 69Mb L: 11/47 MS: 1 ShuffleBytes- 00:07:50.507 [2024-07-15 00:15:49.368108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070052118527 len:65536 00:07:50.507 [2024-07-15 00:15:49.368142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.368262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.508 [2024-07-15 00:15:49.368289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.368404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:2816 00:07:50.508 [2024-07-15 00:15:49.368429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.508 #61 NEW cov: 11764 ft: 14681 corp: 38/1153b lim: 50 exec/s: 61 rss: 69Mb L: 31/47 MS: 1 InsertByte- 00:07:50.508 [2024-07-15 00:15:49.408460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65344 00:07:50.508 [2024-07-15 00:15:49.408490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.408579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.508 [2024-07-15 00:15:49.408603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.408723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:1 00:07:50.508 [2024-07-15 00:15:49.408748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.408867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069415174143 len:65536 00:07:50.508 [2024-07-15 00:15:49.408893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.508 #62 NEW cov: 11764 ft: 14709 corp: 39/1194b lim: 50 exec/s: 62 rss: 70Mb L: 41/47 MS: 1 InsertByte- 00:07:50.508 [2024-07-15 00:15:49.458401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070756761599 len:65536 00:07:50.508 [2024-07-15 00:15:49.458432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.458535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.508 [2024-07-15 00:15:49.458559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.458675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:50.508 [2024-07-15 00:15:49.458695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.508 #65 NEW cov: 11764 ft: 14711 corp: 40/1228b lim: 50 exec/s: 65 rss: 70Mb L: 34/47 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:50.508 [2024-07-15 00:15:49.498421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073698934783 len:65536 00:07:50.508 [2024-07-15 00:15:49.498455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.498556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.508 [2024-07-15 00:15:49.498580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.498698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743021442564095 len:13056 00:07:50.508 [2024-07-15 00:15:49.498718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.508 #66 NEW cov: 11764 ft: 14716 corp: 41/1258b lim: 50 exec/s: 66 rss: 70Mb L: 30/47 MS: 1 ChangeByte- 00:07:50.508 [2024-07-15 00:15:49.538594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446535166489657343 len:65536 00:07:50.508 [2024-07-15 00:15:49.538620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.538736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:50.508 [2024-07-15 00:15:49.538753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.508 [2024-07-15 00:15:49.538864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:2 nsid:0 lba:18446743021442564095 len:13056
00:07:50.508 [2024-07-15 00:15:49.538884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:50.767 #67 NEW cov: 11764 ft: 14727 corp: 42/1288b lim: 50 exec/s: 67 rss: 70Mb L: 30/47 MS: 1 ChangeByte-
00:07:50.767 [2024-07-15 00:15:49.578600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18444492273885249535 len:65536
00:07:50.767 [2024-07-15 00:15:49.578626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:50.767 [2024-07-15 00:15:49.578751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536
00:07:50.767 [2024-07-15 00:15:49.578773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:50.767 #68 NEW cov: 11764 ft: 14731 corp: 43/1316b lim: 50 exec/s: 34 rss: 70Mb L: 28/47 MS: 1 EraseBytes-
00:07:50.767 #68 DONE cov: 11764 ft: 14731 corp: 43/1316b lim: 50 exec/s: 34 rss: 70Mb
00:07:50.767 ###### Recommended dictionary. ######
00:07:50.767 "3\2474\216\313\233*\000" # Uses: 2
00:07:50.767 "\001\000\000\000" # Uses: 1
00:07:50.767 ###### End of recommended dictionary. ######
00:07:50.767 Done 68 runs in 2 second(s)
00:07:50.767 00:15:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf
00:15:49 -- ../common.sh@72 -- # (( i++ ))
00:15:49 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:15:49 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1
00:15:49 -- nvmf/run.sh@23 -- # local fuzzer_type=20
00:15:49 -- nvmf/run.sh@24 -- # local timen=1
00:15:49 -- nvmf/run.sh@25 -- # local core=0x1
00:15:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:15:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf
00:15:49 -- nvmf/run.sh@29 -- # printf %02d 20
00:15:49 -- nvmf/run.sh@29 -- # port=4420
00:15:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:15:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420'
00:15:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:15:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock
[2024-07-15 00:15:49.763369] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
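(Editor's note) The nvmf/run.sh@23-36 trace above records everything "start_llvm_fuzz 20 1 0x1" does before the run-20 output that follows. The sketch below is a rough reconstruction, not the script's literal source: the commands, flags, and paths are copied from the trace, while the function wrapper, the spdk= shorthand, and the redirection of sed's output into the config file are assumptions (bash xtrace does not print redirections).

    # Editor's sketch of the traced start_llvm_fuzz step; assumptions marked inline.
    spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # shorthand for the traced path (assumption)

    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3    # trace: start_llvm_fuzz 20 1 0x1
        local corpus_dir=$spdk/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        # Trace shows "printf %02d 20" then "port=4420"; the 44 prefix is inferred
        # (run 19 above listened on 4419).
        local port=44$(printf %02d $fuzzer_type)
        mkdir -p $corpus_dir
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Patch the listener port into the JSON template; redirect is assumed.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            $spdk/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
        # Launch the libFuzzer target against the in-process NVMe/TCP listener.
        $spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
            -P $spdk/../output/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
            -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk$fuzzer_type.sock
    }

Judging by the (( i++ )) and (( i < fuzz_num )) lines from ../common.sh, the caller appears to loop this function over consecutive fuzzer numbers, one timed run per target.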
00:07:50.767 [2024-07-15 00:15:49.763437] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid335257 ] 00:07:50.767 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.025 [2024-07-15 00:15:49.938611] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.025 [2024-07-15 00:15:50.000203] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:51.025 [2024-07-15 00:15:50.000350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.025 [2024-07-15 00:15:50.059365] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.025 [2024-07-15 00:15:50.075654] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:51.284 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.284 INFO: Seed: 372709572 00:07:51.284 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:51.284 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:51.284 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:51.284 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.284 #2 INITED exec/s: 0 rss: 61Mb 00:07:51.284 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.284 This may also happen if the target rejected all inputs we tried so far 00:07:51.284 [2024-07-15 00:15:50.145706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.284 [2024-07-15 00:15:50.145739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.284 [2024-07-15 00:15:50.145860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.284 [2024-07-15 00:15:50.145883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.542 NEW_FUNC[1/672]: 0x4a2e10 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:51.542 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.542 #21 NEW cov: 11594 ft: 11595 corp: 2/37b lim: 90 exec/s: 0 rss: 67Mb L: 36/36 MS: 4 CrossOver-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:51.542 [2024-07-15 00:15:50.487093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.542 [2024-07-15 00:15:50.487134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.487271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.543 [2024-07-15 00:15:50.487293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.487422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 
nsid:0 00:07:51.543 [2024-07-15 00:15:50.487445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.487542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.543 [2024-07-15 00:15:50.487565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.543 #26 NEW cov: 11708 ft: 12567 corp: 3/109b lim: 90 exec/s: 0 rss: 67Mb L: 72/72 MS: 5 ChangeByte-InsertByte-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:07:51.543 [2024-07-15 00:15:50.526806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.543 [2024-07-15 00:15:50.526838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.526962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.543 [2024-07-15 00:15:50.526988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.527109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.543 [2024-07-15 00:15:50.527133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.543 #27 NEW cov: 11714 ft: 13128 corp: 4/168b lim: 90 exec/s: 0 rss: 67Mb L: 59/72 MS: 1 InsertRepeatedBytes- 00:07:51.543 [2024-07-15 00:15:50.577203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.543 [2024-07-15 00:15:50.577234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.577316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.543 [2024-07-15 00:15:50.577337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.577466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.543 [2024-07-15 00:15:50.577489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.543 [2024-07-15 00:15:50.577609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.543 [2024-07-15 00:15:50.577628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.802 #28 NEW cov: 11799 ft: 13414 corp: 5/240b lim: 90 exec/s: 0 rss: 68Mb L: 72/72 MS: 1 ChangeByte- 00:07:51.802 [2024-07-15 00:15:50.627389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.802 [2024-07-15 00:15:50.627419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.802 [2024-07-15 00:15:50.627543] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.802 [2024-07-15 00:15:50.627564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.802 [2024-07-15 00:15:50.627694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.802 [2024-07-15 00:15:50.627717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.802 [2024-07-15 00:15:50.627841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.803 [2024-07-15 00:15:50.627860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.803 #29 NEW cov: 11799 ft: 13490 corp: 6/320b lim: 90 exec/s: 0 rss: 68Mb L: 80/80 MS: 1 CMP- DE: ">\000\000\000\000\000\000\000"- 00:07:51.803 [2024-07-15 00:15:50.667116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.803 [2024-07-15 00:15:50.667145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.667237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.803 [2024-07-15 00:15:50.667256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.667369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.803 [2024-07-15 00:15:50.667386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.667510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.803 [2024-07-15 00:15:50.667531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.803 #30 NEW cov: 11799 ft: 13607 corp: 7/392b lim: 90 exec/s: 0 rss: 68Mb L: 72/80 MS: 1 ShuffleBytes- 00:07:51.803 [2024-07-15 00:15:50.717639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.803 [2024-07-15 00:15:50.717670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.717756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.803 [2024-07-15 00:15:50.717781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.717904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.803 [2024-07-15 00:15:50.717924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.718048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.803 [2024-07-15 
00:15:50.718070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.803 #31 NEW cov: 11799 ft: 13688 corp: 8/464b lim: 90 exec/s: 0 rss: 68Mb L: 72/80 MS: 1 ChangeBit- 00:07:51.803 [2024-07-15 00:15:50.757830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.803 [2024-07-15 00:15:50.757860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.757912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.803 [2024-07-15 00:15:50.757925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.758039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.803 [2024-07-15 00:15:50.758062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.758155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.803 [2024-07-15 00:15:50.758174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.803 #32 NEW cov: 11799 ft: 13756 corp: 9/536b lim: 90 exec/s: 0 rss: 68Mb L: 72/80 MS: 1 CopyPart- 00:07:51.803 [2024-07-15 00:15:50.797887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.803 [2024-07-15 00:15:50.797920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.798036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.803 [2024-07-15 00:15:50.798056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.798171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:51.803 [2024-07-15 00:15:50.798191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.798304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:51.803 [2024-07-15 00:15:50.798327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.803 #33 NEW cov: 11799 ft: 13781 corp: 10/608b lim: 90 exec/s: 0 rss: 68Mb L: 72/80 MS: 1 CrossOver- 00:07:51.803 [2024-07-15 00:15:50.847347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:51.803 [2024-07-15 00:15:50.847378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.803 [2024-07-15 00:15:50.847487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:51.803 [2024-07-15 00:15:50.847508] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.063 #34 NEW cov: 11799 ft: 13873 corp: 11/661b lim: 90 exec/s: 0 rss: 68Mb L: 53/80 MS: 1 EraseBytes- 00:07:52.063 [2024-07-15 00:15:50.887731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.063 [2024-07-15 00:15:50.887762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.887862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.063 [2024-07-15 00:15:50.887882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.888001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.063 [2024-07-15 00:15:50.888020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.888141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.063 [2024-07-15 00:15:50.888161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.063 #40 NEW cov: 11799 ft: 13913 corp: 12/742b lim: 90 exec/s: 0 rss: 68Mb L: 81/81 MS: 1 InsertByte- 00:07:52.063 [2024-07-15 00:15:50.938384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.063 [2024-07-15 00:15:50.938409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.938527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.063 [2024-07-15 00:15:50.938550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.938671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.063 [2024-07-15 00:15:50.938691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.938816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.063 [2024-07-15 00:15:50.938836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.063 #41 NEW cov: 11799 ft: 13932 corp: 13/814b lim: 90 exec/s: 0 rss: 68Mb L: 72/81 MS: 1 ChangeByte- 00:07:52.063 [2024-07-15 00:15:50.977922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.063 [2024-07-15 00:15:50.977955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:50.978071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.063 [2024-07-15 00:15:50.978091] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.063 #42 NEW cov: 11799 ft: 13944 corp: 14/859b lim: 90 exec/s: 0 rss: 68Mb L: 45/81 MS: 1 InsertRepeatedBytes- 00:07:52.063 [2024-07-15 00:15:51.018458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.063 [2024-07-15 00:15:51.018490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:51.018572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.063 [2024-07-15 00:15:51.018594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:51.018713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.063 [2024-07-15 00:15:51.018737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:51.018867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.063 [2024-07-15 00:15:51.018889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.063 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.063 #43 NEW cov: 11822 ft: 14008 corp: 15/931b lim: 90 exec/s: 0 rss: 69Mb L: 72/81 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:52.063 [2024-07-15 00:15:51.068663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.063 [2024-07-15 00:15:51.068695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:51.068807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.063 [2024-07-15 00:15:51.068832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.063 [2024-07-15 00:15:51.068946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.064 [2024-07-15 00:15:51.068965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.064 [2024-07-15 00:15:51.069053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.064 [2024-07-15 00:15:51.069076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.064 #44 NEW cov: 11822 ft: 14029 corp: 16/1020b lim: 90 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:07:52.064 [2024-07-15 00:15:51.108769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.064 [2024-07-15 00:15:51.108799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
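Every completion record in this log carries the same status tuple, "INVALID NAMESPACE OR FORMAT (00/0b) ... p:0 m:0 dnr:1": status code type 00h (generic command status) and status code 0Bh, with the do-not-retry bit set, meaning the target rejects each fuzzed reservation command against a namespace that does not exist. The sketch below is a minimal, self-contained illustration of how those fields unpack from dword 3 of an NVMe completion queue entry; it mirrors what spdk_nvme_print_completion logs but is not SPDK's own code, and the bit positions come from the NVMe base specification.

#include <stdint.h>
#include <stdio.h>

/* Hedged sketch, not SPDK's implementation: unpack the fields that
 * spdk_nvme_print_completion logs ("(00/0b) qid:1 cid:3 ... p:0 m:0 dnr:1")
 * from dword 3 of an NVMe completion queue entry. */
static void print_status(uint32_t cqe_dw3)
{
	unsigned cid = cqe_dw3 & 0xffffu;       /* command identifier              */
	unsigned p   = (cqe_dw3 >> 16) & 0x1u;  /* phase tag                       */
	unsigned sc  = (cqe_dw3 >> 17) & 0xffu; /* status code, 0x0b here          */
	unsigned sct = (cqe_dw3 >> 25) & 0x7u;  /* status code type, 0 = generic   */
	unsigned m   = (cqe_dw3 >> 30) & 0x1u;  /* "more" flag                     */
	unsigned dnr = (cqe_dw3 >> 31) & 0x1u;  /* do-not-retry flag               */

	/* SCT 00h / SC 0Bh is "Invalid Namespace or Format" in the NVMe spec. */
	printf("(%02x/%02x) cid:%u p:%u m:%u dnr:%u\n", sct, sc, cid, p, m, dnr);
}

int main(void)
{
	/* Reconstruct the status seen throughout this run: sct=0, sc=0x0b, dnr=1. */
	print_status((1u << 31) | (0x0bu << 17) | 3u);
	return 0;
}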
00:07:52.064 [2024-07-15 00:15:51.108897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.064 [2024-07-15 00:15:51.108913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.064 [2024-07-15 00:15:51.109032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.064 [2024-07-15 00:15:51.109048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.064 [2024-07-15 00:15:51.109172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.064 [2024-07-15 00:15:51.109195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.324 #45 NEW cov: 11822 ft: 14068 corp: 17/1092b lim: 90 exec/s: 45 rss: 69Mb L: 72/89 MS: 1 ShuffleBytes- 00:07:52.324 [2024-07-15 00:15:51.148955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.324 [2024-07-15 00:15:51.148983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.149080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.324 [2024-07-15 00:15:51.149101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.149219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.324 [2024-07-15 00:15:51.149242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.149364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.324 [2024-07-15 00:15:51.149384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.324 #46 NEW cov: 11822 ft: 14085 corp: 18/1167b lim: 90 exec/s: 46 rss: 69Mb L: 75/89 MS: 1 InsertRepeatedBytes- 00:07:52.324 [2024-07-15 00:15:51.188758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.324 [2024-07-15 00:15:51.188788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.188899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.324 [2024-07-15 00:15:51.188923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.189046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.324 [2024-07-15 00:15:51.189069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.324 #47 NEW cov: 11822 ft: 14109 corp: 19/1226b lim: 90 exec/s: 47 rss: 69Mb L: 59/89 MS: 1 
EraseBytes- 00:07:52.324 [2024-07-15 00:15:51.229142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.324 [2024-07-15 00:15:51.229170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.229269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.324 [2024-07-15 00:15:51.229292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.229408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.324 [2024-07-15 00:15:51.229425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.229566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.324 [2024-07-15 00:15:51.229599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.324 #48 NEW cov: 11822 ft: 14118 corp: 20/1298b lim: 90 exec/s: 48 rss: 69Mb L: 72/89 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:52.324 [2024-07-15 00:15:51.268795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.324 [2024-07-15 00:15:51.268819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.268948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.324 [2024-07-15 00:15:51.268970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.308904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.324 [2024-07-15 00:15:51.308936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.309059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.324 [2024-07-15 00:15:51.309078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.324 #50 NEW cov: 11822 ft: 14123 corp: 21/1349b lim: 90 exec/s: 50 rss: 69Mb L: 51/89 MS: 2 InsertRepeatedBytes-PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:52.324 [2024-07-15 00:15:51.349061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.324 [2024-07-15 00:15:51.349086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.324 [2024-07-15 00:15:51.349209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.324 [2024-07-15 00:15:51.349231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
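The "#NN NEW" lines are standard libFuzzer status output: cov counts covered code edges, ft distinct coverage features, corp is corpus size as units/total bytes, lim the current input-length cap, exec/s the execution rate, L the new unit's length versus the largest in the corpus, and MS the mutation sequence that produced it. "PersAutoDict- DE:" means the mutator spliced in a persistent auto-dictionary entry, printed with C-style octal escapes, so ">\000\000\000\000\000\000\000" is the byte 0x3e followed by seven NUL bytes. A small illustrative decoder for that escape form follows; it is an assumption-laden sketch, not libFuzzer's code.

#include <stdio.h>

/* Hedged sketch: expand the C-style octal escapes libFuzzer uses when it
 * prints dictionary entries, e.g. ">\000\000\000\000\000\000\000". */
static size_t decode_entry(const char *s, unsigned char *out)
{
	size_t n = 0;
	while (*s) {
		if (s[0] == '\\' && s[1] >= '0' && s[1] <= '7') {
			unsigned v = 0;
			int i;
			for (i = 1; i <= 3 && s[i] >= '0' && s[i] <= '7'; i++)
				v = v * 8 + (unsigned)(s[i] - '0');
			out[n++] = (unsigned char)v;
			s += i;
		} else {
			out[n++] = (unsigned char)*s++;
		}
	}
	return n;
}

int main(void)
{
	unsigned char buf[32];
	size_t n = decode_entry(">\\000\\000\\000\\000\\000\\000\\000", buf);
	printf("%zu bytes, first byte 0x%02x\n", n, buf[0]); /* 8 bytes, 0x3e */
	return 0;
}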
00:07:52.324 #51 NEW cov: 11822 ft: 14138 corp: 22/1400b lim: 90 exec/s: 51 rss: 69Mb L: 51/89 MS: 1 ShuffleBytes- 00:07:52.584 [2024-07-15 00:15:51.389671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.584 [2024-07-15 00:15:51.389702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.389807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.584 [2024-07-15 00:15:51.389827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.389952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.584 [2024-07-15 00:15:51.389974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.390096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.584 [2024-07-15 00:15:51.390119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.584 #52 NEW cov: 11822 ft: 14153 corp: 23/1472b lim: 90 exec/s: 52 rss: 69Mb L: 72/89 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:52.584 [2024-07-15 00:15:51.429724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.584 [2024-07-15 00:15:51.429755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.429856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.584 [2024-07-15 00:15:51.429879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.430009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.584 [2024-07-15 00:15:51.430025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.430160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.584 [2024-07-15 00:15:51.430183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.584 #53 NEW cov: 11822 ft: 14163 corp: 24/1547b lim: 90 exec/s: 53 rss: 69Mb L: 75/89 MS: 1 ChangeBinInt- 00:07:52.584 [2024-07-15 00:15:51.469833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.584 [2024-07-15 00:15:51.469863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.469955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.584 [2024-07-15 00:15:51.469980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.470101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.584 [2024-07-15 00:15:51.470123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.470240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.584 [2024-07-15 00:15:51.470265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.584 #54 NEW cov: 11822 ft: 14169 corp: 25/1627b lim: 90 exec/s: 54 rss: 69Mb L: 80/89 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:52.584 [2024-07-15 00:15:51.519535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.584 [2024-07-15 00:15:51.519568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.584 [2024-07-15 00:15:51.519665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.584 [2024-07-15 00:15:51.519696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.584 #55 NEW cov: 11822 ft: 14199 corp: 26/1680b lim: 90 exec/s: 55 rss: 69Mb L: 53/89 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:52.584 [2024-07-15 00:15:51.570298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.585 [2024-07-15 00:15:51.570330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.585 [2024-07-15 00:15:51.570428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.585 [2024-07-15 00:15:51.570465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.585 [2024-07-15 00:15:51.570588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.585 [2024-07-15 00:15:51.570610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.585 [2024-07-15 00:15:51.570739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.585 [2024-07-15 00:15:51.570760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.585 #56 NEW cov: 11822 ft: 14217 corp: 27/1760b lim: 90 exec/s: 56 rss: 69Mb L: 80/89 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:52.585 [2024-07-15 00:15:51.609795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.585 [2024-07-15 00:15:51.609829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.585 [2024-07-15 00:15:51.609921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.585 [2024-07-15 00:15:51.609942] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.585 [2024-07-15 00:15:51.610061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.585 [2024-07-15 00:15:51.610080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.585 [2024-07-15 00:15:51.610194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.585 [2024-07-15 00:15:51.610216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.585 #57 NEW cov: 11822 ft: 14233 corp: 28/1842b lim: 90 exec/s: 57 rss: 69Mb L: 82/89 MS: 1 CopyPart- 00:07:52.845 [2024-07-15 00:15:51.650341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.845 [2024-07-15 00:15:51.650371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.650492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.845 [2024-07-15 00:15:51.650513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.650631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.845 [2024-07-15 00:15:51.650649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.650767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.845 [2024-07-15 00:15:51.650786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.845 #58 NEW cov: 11822 ft: 14237 corp: 29/1922b lim: 90 exec/s: 58 rss: 69Mb L: 80/89 MS: 1 ShuffleBytes- 00:07:52.845 [2024-07-15 00:15:51.690419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.845 [2024-07-15 00:15:51.690450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.690550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.845 [2024-07-15 00:15:51.690570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.690684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.845 [2024-07-15 00:15:51.690708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.690828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.845 [2024-07-15 00:15:51.690848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:07:52.845 #59 NEW cov: 11822 ft: 14243 corp: 30/1994b lim: 90 exec/s: 59 rss: 69Mb L: 72/89 MS: 1 ChangeByte- 00:07:52.845 [2024-07-15 00:15:51.730576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.845 [2024-07-15 00:15:51.730607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.730706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.845 [2024-07-15 00:15:51.730728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.730842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.845 [2024-07-15 00:15:51.730858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.730974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.845 [2024-07-15 00:15:51.730995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.845 #60 NEW cov: 11822 ft: 14258 corp: 31/2076b lim: 90 exec/s: 60 rss: 70Mb L: 82/89 MS: 1 InsertRepeatedBytes- 00:07:52.845 [2024-07-15 00:15:51.780779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.845 [2024-07-15 00:15:51.780809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.780913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.845 [2024-07-15 00:15:51.780933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.781057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.845 [2024-07-15 00:15:51.781084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.781204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.845 [2024-07-15 00:15:51.781222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.845 #61 NEW cov: 11822 ft: 14306 corp: 32/2156b lim: 90 exec/s: 61 rss: 70Mb L: 80/89 MS: 1 ShuffleBytes- 00:07:52.845 [2024-07-15 00:15:51.830911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.845 [2024-07-15 00:15:51.830943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.831074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.845 [2024-07-15 00:15:51.831094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.831203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.845 [2024-07-15 00:15:51.831223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.831340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.845 [2024-07-15 00:15:51.831361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.845 #62 NEW cov: 11822 ft: 14316 corp: 33/2244b lim: 90 exec/s: 62 rss: 70Mb L: 88/89 MS: 1 InsertRepeatedBytes- 00:07:52.845 [2024-07-15 00:15:51.871028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:52.845 [2024-07-15 00:15:51.871060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.871164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:52.845 [2024-07-15 00:15:51.871181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.845 [2024-07-15 00:15:51.871303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:52.845 [2024-07-15 00:15:51.871327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.846 [2024-07-15 00:15:51.871454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:52.846 [2024-07-15 00:15:51.871473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.846 #63 NEW cov: 11822 ft: 14411 corp: 34/2332b lim: 90 exec/s: 63 rss: 70Mb L: 88/89 MS: 1 ChangeByte- 00:07:53.106 [2024-07-15 00:15:51.911241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:53.106 [2024-07-15 00:15:51.911275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:51.911401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:53.106 [2024-07-15 00:15:51.911427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:51.911560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:53.106 [2024-07-15 00:15:51.911585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:51.911706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:53.106 [2024-07-15 00:15:51.911722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.106 #64 NEW cov: 11822 ft: 14421 corp: 35/2415b lim: 90 exec/s: 64 
rss: 70Mb L: 83/89 MS: 1 InsertRepeatedBytes- 00:07:53.106 [2024-07-15 00:15:51.951214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:53.106 [2024-07-15 00:15:51.951248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:51.951361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:53.106 [2024-07-15 00:15:51.951386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:51.951500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:53.106 [2024-07-15 00:15:51.951520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.106 #65 NEW cov: 11831 ft: 14506 corp: 36/2469b lim: 90 exec/s: 65 rss: 70Mb L: 54/89 MS: 1 EraseBytes- 00:07:53.106 [2024-07-15 00:15:51.990948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:53.106 [2024-07-15 00:15:51.990972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:51.991111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:53.106 [2024-07-15 00:15:51.991136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.106 #66 NEW cov: 11831 ft: 14520 corp: 37/2514b lim: 90 exec/s: 66 rss: 70Mb L: 45/89 MS: 1 ChangeByte- 00:07:53.106 [2024-07-15 00:15:52.031532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:53.106 [2024-07-15 00:15:52.031564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.031671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:53.106 [2024-07-15 00:15:52.031702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.031821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:53.106 [2024-07-15 00:15:52.031841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.031962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:53.106 [2024-07-15 00:15:52.031980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.106 #67 NEW cov: 11831 ft: 14532 corp: 38/2599b lim: 90 exec/s: 67 rss: 70Mb L: 85/89 MS: 1 InsertRepeatedBytes- 00:07:53.106 [2024-07-15 00:15:52.081286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:53.106 [2024-07-15 00:15:52.081316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.081447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:53.106 [2024-07-15 00:15:52.081472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.081595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:53.106 [2024-07-15 00:15:52.081614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.081740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:53.106 [2024-07-15 00:15:52.081761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.106 #68 NEW cov: 11831 ft: 14608 corp: 39/2671b lim: 90 exec/s: 68 rss: 70Mb L: 72/89 MS: 1 CopyPart- 00:07:53.106 [2024-07-15 00:15:52.121729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:53.106 [2024-07-15 00:15:52.121765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.121889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:53.106 [2024-07-15 00:15:52.121916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.122039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:53.106 [2024-07-15 00:15:52.122062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.106 [2024-07-15 00:15:52.122184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:53.106 [2024-07-15 00:15:52.122208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.106 #69 NEW cov: 11831 ft: 14638 corp: 40/2753b lim: 90 exec/s: 34 rss: 70Mb L: 82/89 MS: 1 ShuffleBytes- 00:07:53.106 #69 DONE cov: 11831 ft: 14638 corp: 40/2753b lim: 90 exec/s: 34 rss: 70Mb 00:07:53.106 ###### Recommended dictionary. ###### 00:07:53.106 ">\000\000\000\000\000\000\000" # Uses: 6 00:07:53.106 "\000\000\000\000" # Uses: 0 00:07:53.106 ###### End of recommended dictionary. 
###### 00:07:53.106 Done 69 runs in 2 second(s) 00:07:53.366 00:15:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:07:53.366 00:15:52 -- ../common.sh@72 -- # (( i++ )) 00:07:53.366 00:15:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.366 00:15:52 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:53.366 00:15:52 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:53.366 00:15:52 -- nvmf/run.sh@24 -- # local timen=1 00:07:53.366 00:15:52 -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.366 00:15:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:53.366 00:15:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:53.366 00:15:52 -- nvmf/run.sh@29 -- # printf %02d 21 00:07:53.366 00:15:52 -- nvmf/run.sh@29 -- # port=4421 00:07:53.366 00:15:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:53.366 00:15:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:53.366 00:15:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.366 00:15:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:07:53.366 [2024-07-15 00:15:52.307332] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:53.366 [2024-07-15 00:15:52.307417] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid335675 ] 00:07:53.366 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.625 [2024-07-15 00:15:52.491914] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.625 [2024-07-15 00:15:52.556802] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.625 [2024-07-15 00:15:52.556947] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.625 [2024-07-15 00:15:52.614743] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.625 [2024-07-15 00:15:52.631028] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:53.625 INFO: Running with entropic power schedule (0xFF, 100). 00:07:53.625 INFO: Seed: 2927685565 00:07:53.625 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:53.625 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:53.625 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:53.625 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.625 #2 INITED exec/s: 0 rss: 60Mb 00:07:53.625 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:53.625 This may also happen if the target rejected all inputs we tried so far 00:07:53.884 [2024-07-15 00:15:52.701463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:53.884 [2024-07-15 00:15:52.701515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.884 [2024-07-15 00:15:52.701624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:53.884 [2024-07-15 00:15:52.701648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.884 [2024-07-15 00:15:52.701784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:53.884 [2024-07-15 00:15:52.701807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.885 [2024-07-15 00:15:52.701930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:53.885 [2024-07-15 00:15:52.701948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.144 NEW_FUNC[1/672]: 0x4a6030 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:54.144 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.144 #8 NEW cov: 11570 ft: 11571 corp: 2/46b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:54.144 [2024-07-15 00:15:53.032240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.144 [2024-07-15 00:15:53.032277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.032403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.144 [2024-07-15 00:15:53.032431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.032566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.144 [2024-07-15 00:15:53.032586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.032708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.144 [2024-07-15 00:15:53.032730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.144 #9 NEW cov: 11683 ft: 12229 corp: 3/95b lim: 50 exec/s: 0 rss: 67Mb L: 49/49 MS: 1 CopyPart- 00:07:54.144 [2024-07-15 00:15:53.092375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.144 [2024-07-15 00:15:53.092411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.144 
[2024-07-15 00:15:53.092536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.144 [2024-07-15 00:15:53.092559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.092688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.144 [2024-07-15 00:15:53.092709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.092837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.144 [2024-07-15 00:15:53.092864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.144 #10 NEW cov: 11689 ft: 12402 corp: 4/143b lim: 50 exec/s: 0 rss: 67Mb L: 48/49 MS: 1 InsertRepeatedBytes- 00:07:54.144 [2024-07-15 00:15:53.142821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.144 [2024-07-15 00:15:53.142855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.142960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.144 [2024-07-15 00:15:53.142985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.143105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.144 [2024-07-15 00:15:53.143127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.143234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.144 [2024-07-15 00:15:53.143257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.143383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.144 [2024-07-15 00:15:53.143407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.144 #11 NEW cov: 11774 ft: 12730 corp: 5/193b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:07:54.144 [2024-07-15 00:15:53.192620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.144 [2024-07-15 00:15:53.192651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.192728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.144 [2024-07-15 00:15:53.192751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.192879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE 
(15) sqid:1 cid:2 nsid:0 00:07:54.144 [2024-07-15 00:15:53.192903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.144 [2024-07-15 00:15:53.193029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.144 [2024-07-15 00:15:53.193051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.403 #12 NEW cov: 11774 ft: 12791 corp: 6/241b lim: 50 exec/s: 0 rss: 67Mb L: 48/50 MS: 1 ChangeBinInt- 00:07:54.403 [2024-07-15 00:15:53.242788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.403 [2024-07-15 00:15:53.242817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.242928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.403 [2024-07-15 00:15:53.242949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.243079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.403 [2024-07-15 00:15:53.243103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.243232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.403 [2024-07-15 00:15:53.243256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.403 #13 NEW cov: 11774 ft: 12948 corp: 7/286b lim: 50 exec/s: 0 rss: 67Mb L: 45/50 MS: 1 ShuffleBytes- 00:07:54.403 [2024-07-15 00:15:53.293025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.403 [2024-07-15 00:15:53.293057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.293154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.403 [2024-07-15 00:15:53.293182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.293304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.403 [2024-07-15 00:15:53.293327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.293464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.403 [2024-07-15 00:15:53.293489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.403 #14 NEW cov: 11774 ft: 12988 corp: 8/331b lim: 50 exec/s: 0 rss: 67Mb L: 45/50 MS: 1 ChangeBinInt- 00:07:54.403 [2024-07-15 00:15:53.343057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:0 nsid:0 00:07:54.403 [2024-07-15 00:15:53.343088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.343193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.403 [2024-07-15 00:15:53.343215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.343350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.403 [2024-07-15 00:15:53.343373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.343520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.403 [2024-07-15 00:15:53.343548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.403 #15 NEW cov: 11774 ft: 13028 corp: 9/379b lim: 50 exec/s: 0 rss: 67Mb L: 48/50 MS: 1 ChangeBinInt- 00:07:54.403 [2024-07-15 00:15:53.393595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.403 [2024-07-15 00:15:53.393627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.393739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.403 [2024-07-15 00:15:53.393759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.393894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.403 [2024-07-15 00:15:53.393921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.394054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.403 [2024-07-15 00:15:53.394076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.394207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.403 [2024-07-15 00:15:53.394232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.403 #16 NEW cov: 11774 ft: 13090 corp: 10/429b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:07:54.403 [2024-07-15 00:15:53.443779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.403 [2024-07-15 00:15:53.443815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.443926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.403 [2024-07-15 00:15:53.443946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.444069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.403 [2024-07-15 00:15:53.444092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.444213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.403 [2024-07-15 00:15:53.444236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.403 [2024-07-15 00:15:53.444371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.403 [2024-07-15 00:15:53.444390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.662 #17 NEW cov: 11774 ft: 13112 corp: 11/479b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:54.662 [2024-07-15 00:15:53.503377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.662 [2024-07-15 00:15:53.503411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.503545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.662 [2024-07-15 00:15:53.503566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.503702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.662 [2024-07-15 00:15:53.503725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.662 #18 NEW cov: 11774 ft: 13518 corp: 12/517b lim: 50 exec/s: 0 rss: 68Mb L: 38/50 MS: 1 CrossOver- 00:07:54.662 [2024-07-15 00:15:53.554212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.662 [2024-07-15 00:15:53.554247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.554319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.662 [2024-07-15 00:15:53.554340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.554473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.662 [2024-07-15 00:15:53.554492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.554596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.662 [2024-07-15 00:15:53.554625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.554767] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.662 [2024-07-15 00:15:53.554794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.662 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.662 #19 NEW cov: 11797 ft: 13594 corp: 13/567b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:54.662 [2024-07-15 00:15:53.614371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.662 [2024-07-15 00:15:53.614406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.614512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.662 [2024-07-15 00:15:53.614538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.614654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.662 [2024-07-15 00:15:53.614676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.614797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.662 [2024-07-15 00:15:53.614812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.614942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.662 [2024-07-15 00:15:53.614965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.662 #20 NEW cov: 11797 ft: 13610 corp: 14/617b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:07:54.662 [2024-07-15 00:15:53.664527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.662 [2024-07-15 00:15:53.664560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.664651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.662 [2024-07-15 00:15:53.664675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.664806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.662 [2024-07-15 00:15:53.664827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.664948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.662 [2024-07-15 00:15:53.664968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 
00:15:53.665096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.662 [2024-07-15 00:15:53.665115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.662 #22 NEW cov: 11797 ft: 13644 corp: 15/667b lim: 50 exec/s: 22 rss: 68Mb L: 50/50 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:54.662 [2024-07-15 00:15:53.714660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.662 [2024-07-15 00:15:53.714698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.714795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.662 [2024-07-15 00:15:53.714815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.714934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.662 [2024-07-15 00:15:53.714952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.715082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.662 [2024-07-15 00:15:53.715105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.662 [2024-07-15 00:15:53.715236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.662 [2024-07-15 00:15:53.715255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.921 #23 NEW cov: 11797 ft: 13669 corp: 16/717b lim: 50 exec/s: 23 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:07:54.921 [2024-07-15 00:15:53.764780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.921 [2024-07-15 00:15:53.764812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.764909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.921 [2024-07-15 00:15:53.764934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.765063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.921 [2024-07-15 00:15:53.765085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.765225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.921 [2024-07-15 00:15:53.765247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.765380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:4 nsid:0 00:07:54.921 [2024-07-15 00:15:53.765405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.921 #24 NEW cov: 11797 ft: 13680 corp: 17/767b lim: 50 exec/s: 24 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:07:54.921 [2024-07-15 00:15:53.814475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.921 [2024-07-15 00:15:53.814500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.814624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.921 [2024-07-15 00:15:53.814646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.814766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.921 [2024-07-15 00:15:53.814785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.814908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.921 [2024-07-15 00:15:53.814926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.921 #25 NEW cov: 11797 ft: 13697 corp: 18/816b lim: 50 exec/s: 25 rss: 68Mb L: 49/50 MS: 1 InsertRepeatedBytes- 00:07:54.921 [2024-07-15 00:15:53.865128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:54.921 [2024-07-15 00:15:53.865158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.865266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.921 [2024-07-15 00:15:53.865291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.865432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:54.921 [2024-07-15 00:15:53.865462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.865577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:54.921 [2024-07-15 00:15:53.865598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.865730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:54.921 [2024-07-15 00:15:53.865751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.921 #31 NEW cov: 11797 ft: 13774 corp: 19/866b lim: 50 exec/s: 31 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:07:54.921 [2024-07-15 00:15:53.924497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:0 nsid:0 00:07:54.921 [2024-07-15 00:15:53.924526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.921 [2024-07-15 00:15:53.924653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:54.921 [2024-07-15 00:15:53.924677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.921 #32 NEW cov: 11797 ft: 14090 corp: 20/895b lim: 50 exec/s: 32 rss: 68Mb L: 29/50 MS: 1 EraseBytes- 00:07:55.180 [2024-07-15 00:15:53.985334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.180 [2024-07-15 00:15:53.985367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:53.985504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.180 [2024-07-15 00:15:53.985523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:53.985659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.180 [2024-07-15 00:15:53.985683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:53.985820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.180 [2024-07-15 00:15:53.985843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.180 #33 NEW cov: 11797 ft: 14100 corp: 21/944b lim: 50 exec/s: 33 rss: 69Mb L: 49/50 MS: 1 InsertByte- 00:07:55.180 [2024-07-15 00:15:54.045481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.180 [2024-07-15 00:15:54.045517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.045630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.180 [2024-07-15 00:15:54.045656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.045789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.180 [2024-07-15 00:15:54.045810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.045922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.180 [2024-07-15 00:15:54.045945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.180 #34 NEW cov: 11797 ft: 14125 corp: 22/993b lim: 50 exec/s: 34 rss: 69Mb L: 49/50 MS: 1 CopyPart- 00:07:55.180 [2024-07-15 00:15:54.105623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 
cid:0 nsid:0 00:07:55.180 [2024-07-15 00:15:54.105658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.105783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.180 [2024-07-15 00:15:54.105807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.105942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.180 [2024-07-15 00:15:54.105964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.106090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.180 [2024-07-15 00:15:54.106115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.180 #35 NEW cov: 11797 ft: 14152 corp: 23/1042b lim: 50 exec/s: 35 rss: 69Mb L: 49/50 MS: 1 ChangeByte- 00:07:55.180 [2024-07-15 00:15:54.155974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.180 [2024-07-15 00:15:54.156009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.156129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.180 [2024-07-15 00:15:54.156157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.156299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.180 [2024-07-15 00:15:54.156327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.156461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.180 [2024-07-15 00:15:54.156485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.156607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:55.180 [2024-07-15 00:15:54.156632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:55.180 #36 NEW cov: 11797 ft: 14197 corp: 24/1092b lim: 50 exec/s: 36 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:55.180 [2024-07-15 00:15:54.205826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.180 [2024-07-15 00:15:54.205857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.205980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.180 [2024-07-15 00:15:54.206007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.206142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.180 [2024-07-15 00:15:54.206167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.180 [2024-07-15 00:15:54.206300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.180 [2024-07-15 00:15:54.206322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.180 #37 NEW cov: 11797 ft: 14212 corp: 25/1141b lim: 50 exec/s: 37 rss: 69Mb L: 49/50 MS: 1 ChangeBit- 00:07:55.440 [2024-07-15 00:15:54.266071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.440 [2024-07-15 00:15:54.266104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.266212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.440 [2024-07-15 00:15:54.266235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.266374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.440 [2024-07-15 00:15:54.266397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.266525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.440 [2024-07-15 00:15:54.266553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.440 #38 NEW cov: 11797 ft: 14258 corp: 26/1190b lim: 50 exec/s: 38 rss: 69Mb L: 49/50 MS: 1 ChangeBinInt- 00:07:55.440 [2024-07-15 00:15:54.326527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.440 [2024-07-15 00:15:54.326559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.326634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.440 [2024-07-15 00:15:54.326662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.326788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.440 [2024-07-15 00:15:54.326808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.326940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.440 [2024-07-15 00:15:54.326963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 
00:15:54.327080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:55.440 [2024-07-15 00:15:54.327103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:55.440 #39 NEW cov: 11797 ft: 14277 corp: 27/1240b lim: 50 exec/s: 39 rss: 69Mb L: 50/50 MS: 1 CrossOver- 00:07:55.440 [2024-07-15 00:15:54.386748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.440 [2024-07-15 00:15:54.386784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.386883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.440 [2024-07-15 00:15:54.386910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.387039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.440 [2024-07-15 00:15:54.387072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.387205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.440 [2024-07-15 00:15:54.387228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.387350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:55.440 [2024-07-15 00:15:54.387371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:55.440 #40 NEW cov: 11797 ft: 14292 corp: 28/1290b lim: 50 exec/s: 40 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:07:55.440 [2024-07-15 00:15:54.436102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.440 [2024-07-15 00:15:54.436135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.436260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.440 [2024-07-15 00:15:54.436281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.440 #41 NEW cov: 11797 ft: 14308 corp: 29/1317b lim: 50 exec/s: 41 rss: 69Mb L: 27/50 MS: 1 EraseBytes- 00:07:55.440 [2024-07-15 00:15:54.486561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.440 [2024-07-15 00:15:54.486594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 00:15:54.486716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.440 [2024-07-15 00:15:54.486738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.440 [2024-07-15 
00:15:54.486868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.440 [2024-07-15 00:15:54.486889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.700 #42 NEW cov: 11797 ft: 14319 corp: 30/1351b lim: 50 exec/s: 42 rss: 69Mb L: 34/50 MS: 1 EraseBytes- 00:07:55.700 [2024-07-15 00:15:54.536471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.700 [2024-07-15 00:15:54.536499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.536640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.700 [2024-07-15 00:15:54.536663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.700 #43 NEW cov: 11797 ft: 14344 corp: 31/1379b lim: 50 exec/s: 43 rss: 69Mb L: 28/50 MS: 1 EraseBytes- 00:07:55.700 [2024-07-15 00:15:54.586595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.700 [2024-07-15 00:15:54.586631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.586765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.700 [2024-07-15 00:15:54.586789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.637038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.700 [2024-07-15 00:15:54.637072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.637168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.700 [2024-07-15 00:15:54.637188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.637317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.700 [2024-07-15 00:15:54.637338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.700 #45 NEW cov: 11797 ft: 14365 corp: 32/1409b lim: 50 exec/s: 45 rss: 69Mb L: 30/50 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:55.700 [2024-07-15 00:15:54.687766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:55.700 [2024-07-15 00:15:54.687799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.687922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:55.700 [2024-07-15 00:15:54.687949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
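[Annotator's note] The "#N NEW cov: ..." status lines interleaved with the NVMe notices above are libFuzzer progress reports: cov counts covered code points, ft counts coverage features, corp gives corpus units/bytes, lim is the current input-length cap, L is the length of the newly added input, and MS is the mutation sequence that produced it. A minimal sketch for pulling the coverage progression out of a saved copy of this console output, assuming it was captured to build.log (a hypothetical filename):

  # Extract the coverage progression from a saved copy of this log.
  # Matches both the "#N NEW" and final "#N DONE" status lines.
  grep -Eo '#[0-9]+ (NEW|DONE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' build.log \
    | awk '{print $1, $2, "cov=" $4, "ft=" $6, "corp=" $8}'
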
00:07:55.700 [2024-07-15 00:15:54.688073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:55.700 [2024-07-15 00:15:54.688100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.688226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:55.700 [2024-07-15 00:15:54.688248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.700 [2024-07-15 00:15:54.688379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:55.700 [2024-07-15 00:15:54.688404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:55.700 #46 NEW cov: 11797 ft: 14371 corp: 33/1459b lim: 50 exec/s: 23 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:55.700 #46 DONE cov: 11797 ft: 14371 corp: 33/1459b lim: 50 exec/s: 23 rss: 69Mb 00:07:55.700 Done 46 runs in 2 second(s) 00:07:55.959 00:15:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:07:55.959 00:15:54 -- ../common.sh@72 -- # (( i++ )) 00:07:55.959 00:15:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.959 00:15:54 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:07:55.959 00:15:54 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:07:55.959 00:15:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.959 00:15:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.959 00:15:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:55.959 00:15:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:07:55.959 00:15:54 -- nvmf/run.sh@29 -- # printf %02d 22 00:07:55.959 00:15:54 -- nvmf/run.sh@29 -- # port=4422 00:07:55.959 00:15:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:55.959 00:15:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:07:55.959 00:15:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.959 00:15:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:07:55.959 [2024-07-15 00:15:54.874821] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
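[Annotator's note] The nvmf/run.sh xtrace above shows how each short-fuzz run is wired up: the previous run's JSON config is removed, fuzzer number 22 is zero-padded and appended to "44" to form TCP port 4422, the template config's trsvcid is rewritten, and llvm_nvme_fuzz is started with time budget -t 1 pinned to core 0x1 against the local TCP target. A minimal standalone sketch of that sequence follows; SPDK_DIR and OUT are hypothetical stand-ins for the workspace paths, and the redirection of the sed output into the conf file is inferred, since xtrace does not display redirections:

  #!/usr/bin/env bash
  # Sketch of the run-22 launch traced above. SPDK_DIR and OUT are
  # hypothetical placeholders for the Jenkins workspace and output dirs.
  set -e
  FUZZER=22
  PORT="44$(printf '%02d' "$FUZZER")"   # printf %02d 22 -> port 4422, as in the trace
  CONF="/tmp/fuzz_json_${FUZZER}.conf"
  CORPUS="$SPDK_DIR/../corpus/llvm_nvmf_${FUZZER}"
  TRID="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT"

  mkdir -p "$CORPUS"
  # run.sh rewrites the template's trsvcid for this run; the > redirection
  # here is an assumption (the xtrace does not show it).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$CONF"

  # Flags copied verbatim from the trace: core mask, memory size,
  # output prefix, target TRID, config file, time budget, corpus dir,
  # fuzzer number, RPC socket.
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m 0x1 -s 512 -P "$OUT/llvm/" -F "$TRID" -c "$CONF" \
      -t 1 -D "$CORPUS" -Z "$FUZZER" -r "/var/tmp/spdk${FUZZER}.sock"
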
00:07:55.959 [2024-07-15 00:15:54.874890] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid336096 ] 00:07:55.959 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.260 [2024-07-15 00:15:55.051397] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.260 [2024-07-15 00:15:55.113008] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.260 [2024-07-15 00:15:55.113154] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.260 [2024-07-15 00:15:55.171193] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.260 [2024-07-15 00:15:55.187437] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:07:56.260 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.260 INFO: Seed: 1190738390 00:07:56.260 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:56.260 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:56.260 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:56.260 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.260 #2 INITED exec/s: 0 rss: 60Mb 00:07:56.260 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.260 This may also happen if the target rejected all inputs we tried so far 00:07:56.260 [2024-07-15 00:15:55.253831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:56.260 [2024-07-15 00:15:55.253865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.260 [2024-07-15 00:15:55.253990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:56.260 [2024-07-15 00:15:55.254013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.260 [2024-07-15 00:15:55.254133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:56.260 [2024-07-15 00:15:55.254154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.546 NEW_FUNC[1/672]: 0x4a82f0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:07:56.546 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.546 #7 NEW cov: 11596 ft: 11597 corp: 2/66b lim: 85 exec/s: 0 rss: 67Mb L: 65/65 MS: 5 CopyPart-CrossOver-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:07:56.546 [2024-07-15 00:15:55.594820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:56.546 [2024-07-15 00:15:55.594859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.546 [2024-07-15 00:15:55.595002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:1 nsid:0 00:07:56.546 [2024-07-15 00:15:55.595027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.546 [2024-07-15 00:15:55.595167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:56.546 [2024-07-15 00:15:55.595190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.805 #8 NEW cov: 11709 ft: 12300 corp: 3/131b lim: 85 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 ShuffleBytes- 00:07:56.805 [2024-07-15 00:15:55.655166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:56.805 [2024-07-15 00:15:55.655198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.805 [2024-07-15 00:15:55.655342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:56.805 [2024-07-15 00:15:55.655364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.805 [2024-07-15 00:15:55.655508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:56.805 [2024-07-15 00:15:55.655529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.805 #10 NEW cov: 11715 ft: 12608 corp: 4/195b lim: 85 exec/s: 0 rss: 67Mb L: 64/65 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:56.805 [2024-07-15 00:15:55.705173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:56.805 [2024-07-15 00:15:55.705206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.805 [2024-07-15 00:15:55.705345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:56.806 [2024-07-15 00:15:55.705375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.806 [2024-07-15 00:15:55.705525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:56.806 [2024-07-15 00:15:55.705550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.806 #11 NEW cov: 11800 ft: 12792 corp: 5/260b lim: 85 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 ShuffleBytes- 00:07:56.806 [2024-07-15 00:15:55.755399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:56.806 [2024-07-15 00:15:55.755436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.806 [2024-07-15 00:15:55.755565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:56.806 [2024-07-15 00:15:55.755587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.806 [2024-07-15 00:15:55.755727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:56.806 [2024-07-15 00:15:55.755751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.806 #17 NEW cov: 11800 ft: 12861 corp: 6/325b lim: 85 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 CopyPart- 00:07:56.806 [2024-07-15 00:15:55.815274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:56.806 [2024-07-15 00:15:55.815308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.806 [2024-07-15 00:15:55.815454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:56.806 [2024-07-15 00:15:55.815477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.806 #18 NEW cov: 11800 ft: 13305 corp: 7/369b lim: 85 exec/s: 0 rss: 67Mb L: 44/65 MS: 1 EraseBytes- 00:07:57.065 [2024-07-15 00:15:55.865761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.065 [2024-07-15 00:15:55.865797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.065 [2024-07-15 00:15:55.865913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.065 [2024-07-15 00:15:55.865938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.065 [2024-07-15 00:15:55.866080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.065 [2024-07-15 00:15:55.866100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.065 #19 NEW cov: 11800 ft: 13395 corp: 8/436b lim: 85 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 CMP- DE: "\377\004"- 00:07:57.065 [2024-07-15 00:15:55.915815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.065 [2024-07-15 00:15:55.915847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.065 [2024-07-15 00:15:55.915922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.065 [2024-07-15 00:15:55.915946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.065 [2024-07-15 00:15:55.916085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.065 [2024-07-15 00:15:55.916106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.065 #20 NEW cov: 11800 ft: 13418 corp: 9/501b lim: 85 exec/s: 0 rss: 68Mb L: 65/67 MS: 1 CopyPart- 00:07:57.065 [2024-07-15 00:15:55.965469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.065 [2024-07-15 00:15:55.965494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:07:57.065 #21 NEW cov: 11800 ft: 14315 corp: 10/525b lim: 85 exec/s: 0 rss: 68Mb L: 24/67 MS: 1 InsertRepeatedBytes- 00:07:57.065 [2024-07-15 00:15:56.016134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.065 [2024-07-15 00:15:56.016165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.065 [2024-07-15 00:15:56.016269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.065 [2024-07-15 00:15:56.016294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.065 [2024-07-15 00:15:56.016434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.065 [2024-07-15 00:15:56.016461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.065 #22 NEW cov: 11800 ft: 14340 corp: 11/592b lim: 85 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 ChangeBinInt- 00:07:57.065 [2024-07-15 00:15:56.065771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.065 [2024-07-15 00:15:56.065804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.065 #23 NEW cov: 11800 ft: 14380 corp: 12/617b lim: 85 exec/s: 0 rss: 68Mb L: 25/67 MS: 1 CrossOver- 00:07:57.065 [2024-07-15 00:15:56.115871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.065 [2024-07-15 00:15:56.115896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.325 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.325 #24 NEW cov: 11823 ft: 14481 corp: 13/642b lim: 85 exec/s: 0 rss: 68Mb L: 25/67 MS: 1 CopyPart- 00:07:57.325 [2024-07-15 00:15:56.176893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.325 [2024-07-15 00:15:56.176927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.177052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.325 [2024-07-15 00:15:56.177077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.177211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.325 [2024-07-15 00:15:56.177231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.177354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.325 [2024-07-15 00:15:56.177376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.325 #25 NEW cov: 11823 ft: 14836 corp: 14/719b lim: 
85 exec/s: 0 rss: 68Mb L: 77/77 MS: 1 CopyPart- 00:07:57.325 [2024-07-15 00:15:56.237066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.325 [2024-07-15 00:15:56.237097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.237178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.325 [2024-07-15 00:15:56.237204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.237327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.325 [2024-07-15 00:15:56.237347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.237469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.325 [2024-07-15 00:15:56.237495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.325 #26 NEW cov: 11823 ft: 14853 corp: 15/796b lim: 85 exec/s: 26 rss: 68Mb L: 77/77 MS: 1 CrossOver- 00:07:57.325 [2024-07-15 00:15:56.287012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.325 [2024-07-15 00:15:56.287044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.287173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.325 [2024-07-15 00:15:56.287197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.287330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.325 [2024-07-15 00:15:56.287354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.325 #27 NEW cov: 11823 ft: 14883 corp: 16/863b lim: 85 exec/s: 27 rss: 68Mb L: 67/77 MS: 1 ChangeBinInt- 00:07:57.325 [2024-07-15 00:15:56.337178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.325 [2024-07-15 00:15:56.337209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.337326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.325 [2024-07-15 00:15:56.337345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.325 [2024-07-15 00:15:56.337493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.325 [2024-07-15 00:15:56.337518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.325 #33 NEW cov: 11823 ft: 14907 corp: 17/930b lim: 
85 exec/s: 33 rss: 69Mb L: 67/77 MS: 1 ChangeBit- 00:07:57.585 [2024-07-15 00:15:56.387377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.585 [2024-07-15 00:15:56.387411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.387555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.585 [2024-07-15 00:15:56.387576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.387709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.585 [2024-07-15 00:15:56.387735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.585 #34 NEW cov: 11823 ft: 14939 corp: 18/995b lim: 85 exec/s: 34 rss: 69Mb L: 65/77 MS: 1 ChangeByte- 00:07:57.585 [2024-07-15 00:15:56.437862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.585 [2024-07-15 00:15:56.437894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.437996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.585 [2024-07-15 00:15:56.438013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.438134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.585 [2024-07-15 00:15:56.438160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.438301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.585 [2024-07-15 00:15:56.438327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.585 #35 NEW cov: 11823 ft: 14953 corp: 19/1063b lim: 85 exec/s: 35 rss: 69Mb L: 68/77 MS: 1 CrossOver- 00:07:57.585 [2024-07-15 00:15:56.497486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.585 [2024-07-15 00:15:56.497530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.497653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.585 [2024-07-15 00:15:56.497677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.585 #36 NEW cov: 11823 ft: 14994 corp: 20/1107b lim: 85 exec/s: 36 rss: 69Mb L: 44/77 MS: 1 ChangeBit- 00:07:57.585 [2024-07-15 00:15:56.548008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.585 [2024-07-15 00:15:56.548037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.548169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.585 [2024-07-15 00:15:56.548194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.548326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.585 [2024-07-15 00:15:56.548351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.585 #37 NEW cov: 11823 ft: 15007 corp: 21/1174b lim: 85 exec/s: 37 rss: 69Mb L: 67/77 MS: 1 ChangeBit- 00:07:57.585 [2024-07-15 00:15:56.598114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.585 [2024-07-15 00:15:56.598146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.598255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.585 [2024-07-15 00:15:56.598279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.585 [2024-07-15 00:15:56.598410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.585 [2024-07-15 00:15:56.598434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.585 #38 NEW cov: 11823 ft: 15033 corp: 22/1241b lim: 85 exec/s: 38 rss: 69Mb L: 67/77 MS: 1 ChangeByte- 00:07:57.844 [2024-07-15 00:15:56.648528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.844 [2024-07-15 00:15:56.648572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.648715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.844 [2024-07-15 00:15:56.648743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.648889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.844 [2024-07-15 00:15:56.648911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.649046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.844 [2024-07-15 00:15:56.649069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.844 #39 NEW cov: 11823 ft: 15052 corp: 23/1309b lim: 85 exec/s: 39 rss: 69Mb L: 68/77 MS: 1 InsertRepeatedBytes- 00:07:57.844 [2024-07-15 00:15:56.708597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.844 [2024-07-15 00:15:56.708633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.708758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.844 [2024-07-15 00:15:56.708781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.708914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.844 [2024-07-15 00:15:56.708935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.844 #40 NEW cov: 11823 ft: 15087 corp: 24/1376b lim: 85 exec/s: 40 rss: 69Mb L: 67/77 MS: 1 CopyPart- 00:07:57.844 [2024-07-15 00:15:56.758970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.844 [2024-07-15 00:15:56.759004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.759112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.844 [2024-07-15 00:15:56.759137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.759281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.844 [2024-07-15 00:15:56.759306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.759445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.844 [2024-07-15 00:15:56.759470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.844 #41 NEW cov: 11823 ft: 15106 corp: 25/1456b lim: 85 exec/s: 41 rss: 69Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:07:57.844 [2024-07-15 00:15:56.819191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.844 [2024-07-15 00:15:56.819230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.819363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.844 [2024-07-15 00:15:56.819384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.819520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.844 [2024-07-15 00:15:56.819540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.819681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.844 [2024-07-15 00:15:56.819699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.844 #42 NEW cov: 
11823 ft: 15116 corp: 26/1527b lim: 85 exec/s: 42 rss: 69Mb L: 71/80 MS: 1 CrossOver- 00:07:57.844 [2024-07-15 00:15:56.869303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:57.844 [2024-07-15 00:15:56.869335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.869459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:57.844 [2024-07-15 00:15:56.869480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.869614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:57.844 [2024-07-15 00:15:56.869638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.844 [2024-07-15 00:15:56.869769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:57.844 [2024-07-15 00:15:56.869792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.844 #43 NEW cov: 11823 ft: 15135 corp: 27/1604b lim: 85 exec/s: 43 rss: 69Mb L: 77/80 MS: 1 ChangeBit- 00:07:58.103 [2024-07-15 00:15:56.919423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.103 [2024-07-15 00:15:56.919464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.103 [2024-07-15 00:15:56.919590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:58.103 [2024-07-15 00:15:56.919616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.103 [2024-07-15 00:15:56.919750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:58.103 [2024-07-15 00:15:56.919775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.104 [2024-07-15 00:15:56.919918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:58.104 [2024-07-15 00:15:56.919945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.104 #44 NEW cov: 11823 ft: 15147 corp: 28/1682b lim: 85 exec/s: 44 rss: 70Mb L: 78/80 MS: 1 InsertByte- 00:07:58.104 [2024-07-15 00:15:56.968722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.104 [2024-07-15 00:15:56.968748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.104 #45 NEW cov: 11823 ft: 15158 corp: 29/1707b lim: 85 exec/s: 45 rss: 70Mb L: 25/80 MS: 1 CopyPart- 00:07:58.104 [2024-07-15 00:15:57.019508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.104 [2024-07-15 00:15:57.019547] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.104 [2024-07-15 00:15:57.019693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:58.104 [2024-07-15 00:15:57.019719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.104 [2024-07-15 00:15:57.019849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:58.104 [2024-07-15 00:15:57.019877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.104 #46 NEW cov: 11823 ft: 15171 corp: 30/1766b lim: 85 exec/s: 46 rss: 70Mb L: 59/80 MS: 1 EraseBytes- 00:07:58.104 [2024-07-15 00:15:57.069640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.104 [2024-07-15 00:15:57.069671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.104 [2024-07-15 00:15:57.069812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:58.104 [2024-07-15 00:15:57.069837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.104 [2024-07-15 00:15:57.069977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:58.104 [2024-07-15 00:15:57.070003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.104 #47 NEW cov: 11823 ft: 15173 corp: 31/1833b lim: 85 exec/s: 47 rss: 70Mb L: 67/80 MS: 1 CrossOver- 00:07:58.104 [2024-07-15 00:15:57.119272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.104 [2024-07-15 00:15:57.119298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.104 #48 NEW cov: 11823 ft: 15184 corp: 32/1857b lim: 85 exec/s: 48 rss: 70Mb L: 24/80 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:58.364 [2024-07-15 00:15:57.180255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.364 [2024-07-15 00:15:57.180289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.364 [2024-07-15 00:15:57.180416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:58.364 [2024-07-15 00:15:57.180446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.364 [2024-07-15 00:15:57.180574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:58.364 [2024-07-15 00:15:57.180594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.364 [2024-07-15 00:15:57.180721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:58.364 [2024-07-15 00:15:57.180744] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.364 #49 NEW cov: 11823 ft: 15198 corp: 33/1934b lim: 85 exec/s: 49 rss: 70Mb L: 77/80 MS: 1 CrossOver- 00:07:58.364 [2024-07-15 00:15:57.240429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:58.364 [2024-07-15 00:15:57.240468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.364 [2024-07-15 00:15:57.240580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:58.364 [2024-07-15 00:15:57.240606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.364 [2024-07-15 00:15:57.240733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:58.364 [2024-07-15 00:15:57.240757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.364 [2024-07-15 00:15:57.240894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:58.364 [2024-07-15 00:15:57.240919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.364 #50 NEW cov: 11823 ft: 15214 corp: 34/2012b lim: 85 exec/s: 25 rss: 70Mb L: 78/80 MS: 1 ChangeBinInt- 00:07:58.364 #50 DONE cov: 11823 ft: 15214 corp: 34/2012b lim: 85 exec/s: 25 rss: 70Mb 00:07:58.364 ###### Recommended dictionary. ###### 00:07:58.364 "\377\004" # Uses: 2 00:07:58.364 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:58.364 ###### End of recommended dictionary. 
###### 00:07:58.364 Done 50 runs in 2 second(s) 00:07:58.364 00:15:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:07:58.364 00:15:57 -- ../common.sh@72 -- # (( i++ )) 00:07:58.364 00:15:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.364 00:15:57 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:07:58.364 00:15:57 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:07:58.364 00:15:57 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.364 00:15:57 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.364 00:15:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:58.364 00:15:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:07:58.364 00:15:57 -- nvmf/run.sh@29 -- # printf %02d 23 00:07:58.364 00:15:57 -- nvmf/run.sh@29 -- # port=4423 00:07:58.364 00:15:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:58.364 00:15:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:07:58.364 00:15:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.364 00:15:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:07:58.624 [2024-07-15 00:15:57.429724] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:58.624 [2024-07-15 00:15:57.429802] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid336636 ] 00:07:58.624 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.624 [2024-07-15 00:15:57.607048] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.624 [2024-07-15 00:15:57.669163] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.624 [2024-07-15 00:15:57.669309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.884 [2024-07-15 00:15:57.727224] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.884 [2024-07-15 00:15:57.743496] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:07:58.884 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.884 INFO: Seed: 3747735803 00:07:58.884 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:58.884 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:58.884 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:58.884 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.884 #2 INITED exec/s: 0 rss: 61Mb 00:07:58.884 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:58.884 This may also happen if the target rejected all inputs we tried so far 00:07:58.884 [2024-07-15 00:15:57.799002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:58.884 [2024-07-15 00:15:57.799032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.884 [2024-07-15 00:15:57.799072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:58.884 [2024-07-15 00:15:57.799088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.884 [2024-07-15 00:15:57.799147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:58.884 [2024-07-15 00:15:57.799162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.884 [2024-07-15 00:15:57.799218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:58.884 [2024-07-15 00:15:57.799234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.143 NEW_FUNC[1/671]: 0x4ab520 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:07:59.143 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.143 #3 NEW cov: 11528 ft: 11530 corp: 2/21b lim: 25 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:59.143 [2024-07-15 00:15:58.129608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.143 [2024-07-15 00:15:58.129642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.143 [2024-07-15 00:15:58.129695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.143 [2024-07-15 00:15:58.129711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.143 [2024-07-15 00:15:58.129761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.143 [2024-07-15 00:15:58.129775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.143 #4 NEW cov: 11642 ft: 12536 corp: 3/40b lim: 25 exec/s: 0 rss: 67Mb L: 19/20 MS: 1 CrossOver- 00:07:59.143 [2024-07-15 00:15:58.179460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.143 [2024-07-15 00:15:58.179488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 #9 NEW cov: 11648 ft: 13184 corp: 4/45b lim: 25 exec/s: 0 rss: 67Mb L: 5/20 MS: 5 CrossOver-ChangeBit-CMP-CopyPart-InsertByte- DE: "\376\377"- 00:07:59.403 [2024-07-15 00:15:58.219881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.403 [2024-07-15 00:15:58.219909] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.219951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.403 [2024-07-15 00:15:58.219966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.220019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.403 [2024-07-15 00:15:58.220035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.220087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.403 [2024-07-15 00:15:58.220103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.403 #10 NEW cov: 11733 ft: 13444 corp: 5/65b lim: 25 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CopyPart- 00:07:59.403 [2024-07-15 00:15:58.260022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.403 [2024-07-15 00:15:58.260050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.260095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.403 [2024-07-15 00:15:58.260110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.260162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.403 [2024-07-15 00:15:58.260177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.260231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.403 [2024-07-15 00:15:58.260246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.403 #11 NEW cov: 11733 ft: 13551 corp: 6/85b lim: 25 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBit- 00:07:59.403 [2024-07-15 00:15:58.300144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.403 [2024-07-15 00:15:58.300171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.300216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.403 [2024-07-15 00:15:58.300231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.300281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.403 [2024-07-15 00:15:58.300297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:07:59.403 [2024-07-15 00:15:58.300348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.403 [2024-07-15 00:15:58.300362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.403 #12 NEW cov: 11733 ft: 13730 corp: 7/105b lim: 25 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:59.403 [2024-07-15 00:15:58.339961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.403 [2024-07-15 00:15:58.339986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 #13 NEW cov: 11733 ft: 13807 corp: 8/110b lim: 25 exec/s: 0 rss: 68Mb L: 5/20 MS: 1 ChangeBit- 00:07:59.403 [2024-07-15 00:15:58.380510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.403 [2024-07-15 00:15:58.380536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.380579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.403 [2024-07-15 00:15:58.380593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.380645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.403 [2024-07-15 00:15:58.380675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.380724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.403 [2024-07-15 00:15:58.380737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.380790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:59.403 [2024-07-15 00:15:58.380804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:59.403 #14 NEW cov: 11733 ft: 13932 corp: 9/135b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 CopyPart- 00:07:59.403 [2024-07-15 00:15:58.420523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.403 [2024-07-15 00:15:58.420549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.403 [2024-07-15 00:15:58.420592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.404 [2024-07-15 00:15:58.420606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.404 [2024-07-15 00:15:58.420660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.404 [2024-07-15 00:15:58.420674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.404 [2024-07-15 
00:15:58.420726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.404 [2024-07-15 00:15:58.420741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.404 #15 NEW cov: 11733 ft: 13957 corp: 10/156b lim: 25 exec/s: 0 rss: 68Mb L: 21/25 MS: 1 InsertByte- 00:07:59.404 [2024-07-15 00:15:58.450568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.404 [2024-07-15 00:15:58.450595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.404 [2024-07-15 00:15:58.450633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.404 [2024-07-15 00:15:58.450646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.404 [2024-07-15 00:15:58.450699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.404 [2024-07-15 00:15:58.450714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.404 [2024-07-15 00:15:58.450768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.404 [2024-07-15 00:15:58.450783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.663 #16 NEW cov: 11733 ft: 14007 corp: 11/177b lim: 25 exec/s: 0 rss: 68Mb L: 21/25 MS: 1 InsertByte- 00:07:59.663 [2024-07-15 00:15:58.490741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.663 [2024-07-15 00:15:58.490772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.663 [2024-07-15 00:15:58.490811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.664 [2024-07-15 00:15:58.490824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.490879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.664 [2024-07-15 00:15:58.490895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.490950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.664 [2024-07-15 00:15:58.490965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.664 #17 NEW cov: 11733 ft: 14025 corp: 12/197b lim: 25 exec/s: 0 rss: 68Mb L: 20/25 MS: 1 ShuffleBytes- 00:07:59.664 [2024-07-15 00:15:58.530811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.664 [2024-07-15 00:15:58.530837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.530884] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.664 [2024-07-15 00:15:58.530900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.530944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.664 [2024-07-15 00:15:58.530959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.531009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.664 [2024-07-15 00:15:58.531024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.664 #18 NEW cov: 11733 ft: 14071 corp: 13/217b lim: 25 exec/s: 0 rss: 68Mb L: 20/25 MS: 1 ChangeBit- 00:07:59.664 [2024-07-15 00:15:58.570874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.664 [2024-07-15 00:15:58.570901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.570940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.664 [2024-07-15 00:15:58.570955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.571009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.664 [2024-07-15 00:15:58.571024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.664 #19 NEW cov: 11733 ft: 14104 corp: 14/236b lim: 25 exec/s: 0 rss: 68Mb L: 19/25 MS: 1 ShuffleBytes- 00:07:59.664 [2024-07-15 00:15:58.610858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.664 [2024-07-15 00:15:58.610884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.610935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.664 [2024-07-15 00:15:58.610950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.664 #20 NEW cov: 11733 ft: 14409 corp: 15/247b lim: 25 exec/s: 0 rss: 68Mb L: 11/25 MS: 1 InsertRepeatedBytes- 00:07:59.664 [2024-07-15 00:15:58.651179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.664 [2024-07-15 00:15:58.651206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.651268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.664 [2024-07-15 00:15:58.651284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.651337] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.664 [2024-07-15 00:15:58.651353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.651407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.664 [2024-07-15 00:15:58.651421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.664 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.664 #21 NEW cov: 11756 ft: 14430 corp: 16/267b lim: 25 exec/s: 0 rss: 68Mb L: 20/25 MS: 1 InsertByte- 00:07:59.664 [2024-07-15 00:15:58.691277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.664 [2024-07-15 00:15:58.691303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.691368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.664 [2024-07-15 00:15:58.691384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.691435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.664 [2024-07-15 00:15:58.691455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.664 [2024-07-15 00:15:58.691511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.664 [2024-07-15 00:15:58.691524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.664 #22 NEW cov: 11756 ft: 14452 corp: 17/287b lim: 25 exec/s: 0 rss: 69Mb L: 20/25 MS: 1 ChangeByte- 00:07:59.924 [2024-07-15 00:15:58.731418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.924 [2024-07-15 00:15:58.731447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.731497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.924 [2024-07-15 00:15:58.731512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.731564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.924 [2024-07-15 00:15:58.731578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.731633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.924 [2024-07-15 00:15:58.731647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.924 #23 NEW cov: 11756 ft: 14477 
corp: 18/307b lim: 25 exec/s: 0 rss: 69Mb L: 20/25 MS: 1 ShuffleBytes- 00:07:59.924 [2024-07-15 00:15:58.771520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.924 [2024-07-15 00:15:58.771550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.771587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.924 [2024-07-15 00:15:58.771601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.771654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.924 [2024-07-15 00:15:58.771669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.771722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.924 [2024-07-15 00:15:58.771737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.924 #24 NEW cov: 11756 ft: 14502 corp: 19/331b lim: 25 exec/s: 24 rss: 69Mb L: 24/25 MS: 1 InsertRepeatedBytes- 00:07:59.924 [2024-07-15 00:15:58.811791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.924 [2024-07-15 00:15:58.811818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.811867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.924 [2024-07-15 00:15:58.811882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.811935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.924 [2024-07-15 00:15:58.811951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.812003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.924 [2024-07-15 00:15:58.812019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.812075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:59.924 [2024-07-15 00:15:58.812090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:59.924 #25 NEW cov: 11756 ft: 14519 corp: 20/356b lim: 25 exec/s: 25 rss: 69Mb L: 25/25 MS: 1 InsertByte- 00:07:59.924 [2024-07-15 00:15:58.851711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.924 [2024-07-15 00:15:58.851738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.851787] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.924 [2024-07-15 00:15:58.851802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.851854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.924 [2024-07-15 00:15:58.851869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.851921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.924 [2024-07-15 00:15:58.851936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.924 #26 NEW cov: 11756 ft: 14542 corp: 21/376b lim: 25 exec/s: 26 rss: 69Mb L: 20/25 MS: 1 ChangeByte- 00:07:59.924 [2024-07-15 00:15:58.881711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.924 [2024-07-15 00:15:58.881739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.881775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.924 [2024-07-15 00:15:58.881790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.924 [2024-07-15 00:15:58.881844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.924 [2024-07-15 00:15:58.881860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.924 #27 NEW cov: 11756 ft: 14561 corp: 22/394b lim: 25 exec/s: 27 rss: 69Mb L: 18/25 MS: 1 EraseBytes- 00:07:59.925 [2024-07-15 00:15:58.921710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.925 [2024-07-15 00:15:58.921736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.925 [2024-07-15 00:15:58.921774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.925 [2024-07-15 00:15:58.921789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.925 #28 NEW cov: 11756 ft: 14569 corp: 23/404b lim: 25 exec/s: 28 rss: 69Mb L: 10/25 MS: 1 EraseBytes- 00:07:59.925 [2024-07-15 00:15:58.962108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:59.925 [2024-07-15 00:15:58.962133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.925 [2024-07-15 00:15:58.962183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:59.925 [2024-07-15 00:15:58.962199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.925 [2024-07-15 00:15:58.962251] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:59.925 [2024-07-15 00:15:58.962266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.925 [2024-07-15 00:15:58.962318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:59.925 [2024-07-15 00:15:58.962331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.184 #29 NEW cov: 11756 ft: 14579 corp: 24/428b lim: 25 exec/s: 29 rss: 69Mb L: 24/25 MS: 1 CopyPart- 00:08:00.184 [2024-07-15 00:15:59.001938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.184 [2024-07-15 00:15:59.001964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.184 [2024-07-15 00:15:59.002000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.184 [2024-07-15 00:15:59.002016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.184 #30 NEW cov: 11756 ft: 14597 corp: 25/442b lim: 25 exec/s: 30 rss: 70Mb L: 14/25 MS: 1 CrossOver- 00:08:00.184 [2024-07-15 00:15:59.041987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.184 [2024-07-15 00:15:59.042014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.184 #31 NEW cov: 11756 ft: 14633 corp: 26/449b lim: 25 exec/s: 31 rss: 70Mb L: 7/25 MS: 1 CrossOver- 00:08:00.185 [2024-07-15 00:15:59.082070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.185 [2024-07-15 00:15:59.082096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.185 #32 NEW cov: 11756 ft: 14654 corp: 27/454b lim: 25 exec/s: 32 rss: 70Mb L: 5/25 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:00.185 [2024-07-15 00:15:59.122528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.185 [2024-07-15 00:15:59.122554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.122602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.185 [2024-07-15 00:15:59.122616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.122668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.185 [2024-07-15 00:15:59.122684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.122738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.185 [2024-07-15 00:15:59.122752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.185 #33 NEW cov: 11756 ft: 14692 corp: 28/474b lim: 25 exec/s: 33 rss: 70Mb L: 20/25 MS: 1 ShuffleBytes- 00:08:00.185 [2024-07-15 00:15:59.152714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.185 [2024-07-15 00:15:59.152740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.152794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.185 [2024-07-15 00:15:59.152809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.152861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.185 [2024-07-15 00:15:59.152875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.152927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.185 [2024-07-15 00:15:59.152942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.152994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:00.185 [2024-07-15 00:15:59.153009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:00.185 #34 NEW cov: 11756 ft: 14702 corp: 29/499b lim: 25 exec/s: 34 rss: 70Mb L: 25/25 MS: 1 InsertByte- 00:08:00.185 [2024-07-15 00:15:59.192731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.185 [2024-07-15 00:15:59.192758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.192799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.185 [2024-07-15 00:15:59.192813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.192870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.185 [2024-07-15 00:15:59.192901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.192956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.185 [2024-07-15 00:15:59.192975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.185 #35 NEW cov: 11756 ft: 14715 corp: 30/520b lim: 25 exec/s: 35 rss: 70Mb L: 21/25 MS: 1 ChangeBinInt- 00:08:00.185 [2024-07-15 00:15:59.232965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.185 [2024-07-15 00:15:59.232992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.233047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.185 [2024-07-15 00:15:59.233062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.233117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.185 [2024-07-15 00:15:59.233132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.233188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.185 [2024-07-15 00:15:59.233204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.185 [2024-07-15 00:15:59.233258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:00.185 [2024-07-15 00:15:59.233273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:00.445 #36 NEW cov: 11756 ft: 14731 corp: 31/545b lim: 25 exec/s: 36 rss: 70Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:00.445 [2024-07-15 00:15:59.272868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.445 [2024-07-15 00:15:59.272895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.272951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.445 [2024-07-15 00:15:59.272967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.273023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.445 [2024-07-15 00:15:59.273038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.445 #37 NEW cov: 11756 ft: 14741 corp: 32/564b lim: 25 exec/s: 37 rss: 70Mb L: 19/25 MS: 1 InsertRepeatedBytes- 00:08:00.445 [2024-07-15 00:15:59.313008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.445 [2024-07-15 00:15:59.313034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.313071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.445 [2024-07-15 00:15:59.313087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.313143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.445 [2024-07-15 00:15:59.313158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.445 #38 NEW cov: 11756 ft: 14770 corp: 33/581b lim: 25 exec/s: 38 rss: 
70Mb L: 17/25 MS: 1 EraseBytes- 00:08:00.445 [2024-07-15 00:15:59.353202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.445 [2024-07-15 00:15:59.353229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.353271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.445 [2024-07-15 00:15:59.353287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.353339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.445 [2024-07-15 00:15:59.353354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.353411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.445 [2024-07-15 00:15:59.353427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.445 #39 NEW cov: 11756 ft: 14774 corp: 34/604b lim: 25 exec/s: 39 rss: 70Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:00.445 [2024-07-15 00:15:59.383327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.445 [2024-07-15 00:15:59.383353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.383400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.445 [2024-07-15 00:15:59.383415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.383479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.445 [2024-07-15 00:15:59.383493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.383549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.445 [2024-07-15 00:15:59.383563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.445 #40 NEW cov: 11756 ft: 14789 corp: 35/627b lim: 25 exec/s: 40 rss: 70Mb L: 23/25 MS: 1 CopyPart- 00:08:00.445 [2024-07-15 00:15:59.423089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.445 [2024-07-15 00:15:59.423114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.445 #41 NEW cov: 11756 ft: 14800 corp: 36/634b lim: 25 exec/s: 41 rss: 70Mb L: 7/25 MS: 1 ChangeByte- 00:08:00.445 [2024-07-15 00:15:59.463449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.445 [2024-07-15 00:15:59.463475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.463512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.445 [2024-07-15 00:15:59.463528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.445 [2024-07-15 00:15:59.463582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.445 [2024-07-15 00:15:59.463598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.445 #42 NEW cov: 11756 ft: 14833 corp: 37/652b lim: 25 exec/s: 42 rss: 70Mb L: 18/25 MS: 1 ChangeBit- 00:08:00.704 [2024-07-15 00:15:59.503668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.704 [2024-07-15 00:15:59.503695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.503750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.704 [2024-07-15 00:15:59.503768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.503823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.704 [2024-07-15 00:15:59.503837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.503891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.704 [2024-07-15 00:15:59.503907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.704 #43 NEW cov: 11756 ft: 14871 corp: 38/672b lim: 25 exec/s: 43 rss: 70Mb L: 20/25 MS: 1 ChangeByte- 00:08:00.704 [2024-07-15 00:15:59.533536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.704 [2024-07-15 00:15:59.533562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.533616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.704 [2024-07-15 00:15:59.533630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.704 #44 NEW cov: 11756 ft: 14882 corp: 39/686b lim: 25 exec/s: 44 rss: 70Mb L: 14/25 MS: 1 ShuffleBytes- 00:08:00.704 [2024-07-15 00:15:59.573518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.704 [2024-07-15 00:15:59.573543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.704 #47 NEW cov: 11756 ft: 14898 corp: 40/691b lim: 25 exec/s: 47 rss: 70Mb L: 5/25 MS: 3 CrossOver-ChangeBit-InsertByte- 00:08:00.704 [2024-07-15 00:15:59.613957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.704 [2024-07-15 
00:15:59.613984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.614030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.704 [2024-07-15 00:15:59.614044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.614096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.704 [2024-07-15 00:15:59.614111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.704 [2024-07-15 00:15:59.614163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:00.704 [2024-07-15 00:15:59.614176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.705 #48 NEW cov: 11756 ft: 14907 corp: 41/715b lim: 25 exec/s: 48 rss: 70Mb L: 24/25 MS: 1 InsertRepeatedBytes- 00:08:00.705 [2024-07-15 00:15:59.653721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.705 [2024-07-15 00:15:59.653747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.705 #49 NEW cov: 11756 ft: 14915 corp: 42/720b lim: 25 exec/s: 49 rss: 70Mb L: 5/25 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:00.705 [2024-07-15 00:15:59.694100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.705 [2024-07-15 00:15:59.694127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.705 [2024-07-15 00:15:59.694165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.705 [2024-07-15 00:15:59.694184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.705 [2024-07-15 00:15:59.694237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:00.705 [2024-07-15 00:15:59.694253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.705 #50 NEW cov: 11756 ft: 14921 corp: 43/739b lim: 25 exec/s: 50 rss: 70Mb L: 19/25 MS: 1 ShuffleBytes- 00:08:00.705 [2024-07-15 00:15:59.734365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:00.705 [2024-07-15 00:15:59.734391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.705 [2024-07-15 00:15:59.734447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:00.705 [2024-07-15 00:15:59.734460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.705 [2024-07-15 00:15:59.734514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:08:00.705 [2024-07-15 00:15:59.734544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:00.705 [2024-07-15 00:15:59.734598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:00.705 [2024-07-15 00:15:59.734613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:00.705 [2024-07-15 00:15:59.734667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:00.705 [2024-07-15 00:15:59.734683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:00.705 #51 NEW cov: 11756 ft: 14922 corp: 44/764b lim: 25 exec/s: 51 rss: 70Mb L: 25/25 MS: 1 ShuffleBytes-
00:08:00.964 [2024-07-15 00:15:59.774408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:00.964 [2024-07-15 00:15:59.774436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:00.964 [2024-07-15 00:15:59.774489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:00.964 [2024-07-15 00:15:59.774504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:00.964 [2024-07-15 00:15:59.774560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:00.964 [2024-07-15 00:15:59.774574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:00.964 [2024-07-15 00:15:59.774629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:00.964 [2024-07-15 00:15:59.774645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:00.964 #52 NEW cov: 11756 ft: 14989 corp: 45/784b lim: 25 exec/s: 26 rss: 70Mb L: 20/25 MS: 1 ShuffleBytes-
00:08:00.964 #52 DONE cov: 11756 ft: 14989 corp: 45/784b lim: 25 exec/s: 26 rss: 70Mb
00:08:00.964 ###### Recommended dictionary. ######
00:08:00.964 "\376\377" # Uses: 0
00:08:00.964 "\002\000\000\000" # Uses: 1
00:08:00.964 ###### End of recommended dictionary. ######
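The recommended dictionary above lists byte sequences (printed with C-style octal escapes, so "\376\377" is 0xFE 0xFF) that libFuzzer found productive during this run. The harness discards them here, but stock libFuzzer can take such entries back in through its -dict= option; a minimal sketch, assuming a hand-written file whose name (nvmf_23.dict) is illustrative:

  # Illustrative only; this CI harness does not generate or pass a dictionary file.
  cat > nvmf_23.dict <<'EOF'
  kw1="\xfe\xff"
  kw2="\x02\x00\x00\x00"
  EOF
  # A future run could then append -dict=nvmf_23.dict to the fuzzer command line.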
00:08:00.964 Done 52 runs in 2 second(s)
00:08:00.964 00:15:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
00:08:00.964 00:15:59 -- ../common.sh@72 -- # (( i++ ))
00:08:00.964 00:15:59 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:00.964 00:15:59 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:08:00.964 00:15:59 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:08:00.964 00:15:59 -- nvmf/run.sh@24 -- # local timen=1
00:08:00.964 00:15:59 -- nvmf/run.sh@25 -- # local core=0x1
00:08:00.964 00:15:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:00.964 00:15:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:08:00.964 00:15:59 -- nvmf/run.sh@29 -- # printf %02d 24
00:08:00.964 00:15:59 -- nvmf/run.sh@29 -- # port=4424
00:08:00.964 00:15:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:00.964 00:15:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:08:00.964 00:15:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:00.964 00:15:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:08:01.224 [2024-07-15 00:15:59.957780] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:08:01.224 [2024-07-15 00:15:59.957868] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid337036 ]
00:08:01.224 EAL: No free 2048 kB hugepages reported on node 1
00:08:01.224 [2024-07-15 00:16:00.146064] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:01.224 [2024-07-15 00:16:00.211876] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:01.224 [2024-07-15 00:16:00.212009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:01.224 [2024-07-15 00:16:00.270296] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:01.483 [2024-07-15 00:16:00.286606] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:08:01.483 INFO: Running with entropic power schedule (0xFF, 100).
00:08:01.483 INFO: Seed: 1995782220
00:08:01.483 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:08:01.483 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:08:01.483 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:01.483 INFO: A corpus is not provided, starting from an empty corpus
00:08:01.483 #2 INITED exec/s: 0 rss: 60Mb
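The run.sh trace above shows how each fuzzer instance is isolated from its neighbours: the instance index (24 here) is zero-padded onto a 44xx TCP port, a per-instance corpus directory is created, and the shared fuzz_json.conf template has its trsvcid rewritten into a per-instance copy before the target is launched. A condensed sketch of that sequence, with illustrative variable names (SPDK_DIR is assumed, not the script's own) rather than the script's verbatim text:

  # Sketch of the per-instance setup traced above; names are illustrative.
  i=24
  port="44$(printf '%02d' "$i")"   # -> 4424, matching the trsvcid in the trace
  mkdir -p "$SPDK_DIR/../corpus/llvm_nvmf_$i"
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"

Giving every instance its own port, config copy, and RPC socket is what lets the short-fuzz job run the nvmf fuzzers back to back on one node without the targets colliding.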
00:08:01.483 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:01.483 This may also happen if the target rejected all inputs we tried so far
00:08:01.483 [2024-07-15 00:16:00.351701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:01.483 [2024-07-15 00:16:00.351732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 NEW_FUNC[1/672]: 0x4ac600 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685
00:08:01.742 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:01.742 #9 NEW cov: 11594 ft: 11590 corp: 2/40b lim: 100 exec/s: 0 rss: 67Mb L: 39/39 MS: 2 CrossOver-InsertRepeatedBytes-
00:08:01.742 [2024-07-15 00:16:00.682677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:01.742 [2024-07-15 00:16:00.682723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 #10 NEW cov: 11714 ft: 12098 corp: 3/79b lim: 100 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ShuffleBytes-
00:08:01.742 [2024-07-15 00:16:00.732832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:01.742 [2024-07-15 00:16:00.732862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 [2024-07-15 00:16:00.732939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:01.742 [2024-07-15 00:16:00.732956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:01.742 #11 NEW cov: 11720 ft: 13047 corp: 4/121b lim: 100 exec/s: 0 rss: 67Mb L: 42/42 MS: 1 CrossOver-
00:08:01.742 [2024-07-15 00:16:00.772775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:01.742 [2024-07-15 00:16:00.772802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:01.742 #12 NEW cov: 11805 ft: 13372 corp: 5/160b lim: 100 exec/s: 0 rss: 67Mb L: 39/42 MS: 1 ChangeByte-
00:08:02.001 [2024-07-15 00:16:00.813099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.001 [2024-07-15 00:16:00.813130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:02.001 [2024-07-15 00:16:00.813189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.001 [2024-07-15 00:16:00.813208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:02.001 #13 NEW cov: 11805 ft: 13427 corp:
6/210b lim: 100 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CopyPart- 00:08:02.001 [2024-07-15 00:16:00.853210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-07-15 00:16:00.853237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.001 [2024-07-15 00:16:00.853274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-07-15 00:16:00.853290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.001 #14 NEW cov: 11805 ft: 13531 corp: 7/250b lim: 100 exec/s: 0 rss: 68Mb L: 40/50 MS: 1 InsertByte- 00:08:02.001 [2024-07-15 00:16:00.903233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-07-15 00:16:00.903262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.001 #15 NEW cov: 11805 ft: 13603 corp: 8/281b lim: 100 exec/s: 0 rss: 68Mb L: 31/50 MS: 1 EraseBytes- 00:08:02.001 [2024-07-15 00:16:00.943325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-07-15 00:16:00.943353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.001 #16 NEW cov: 11805 ft: 13632 corp: 9/306b lim: 100 exec/s: 0 rss: 68Mb L: 25/50 MS: 1 EraseBytes- 00:08:02.001 [2024-07-15 00:16:00.983579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.001 [2024-07-15 00:16:00.983606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.002 [2024-07-15 00:16:00.983642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.002 [2024-07-15 00:16:00.983658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.002 #17 NEW cov: 11805 ft: 13787 corp: 10/356b lim: 100 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:02.002 [2024-07-15 00:16:01.023571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.002 [2024-07-15 00:16:01.023599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.002 #18 NEW cov: 11805 ft: 13878 corp: 11/387b lim: 100 exec/s: 0 rss: 68Mb L: 31/50 MS: 1 ShuffleBytes- 00:08:02.262 [2024-07-15 00:16:01.064111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.064139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.064186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.064202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.064259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.064275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.064332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744072350597119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.064347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.262 #19 NEW cov: 11805 ft: 14319 corp: 12/471b lim: 100 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:08:02.262 [2024-07-15 00:16:01.104101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.104128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.104166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.104183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.104238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.104255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.262 #20 NEW cov: 11805 ft: 14622 corp: 13/544b lim: 100 exec/s: 0 rss: 68Mb L: 73/84 MS: 1 CrossOver- 00:08:02.262 [2024-07-15 00:16:01.144230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.144258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.144312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.144328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.144386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 
00:16:01.144404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.262 #21 NEW cov: 11805 ft: 14663 corp: 14/617b lim: 100 exec/s: 0 rss: 68Mb L: 73/84 MS: 1 ShuffleBytes- 00:08:02.262 [2024-07-15 00:16:01.194381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.194408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.194467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.194485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.194522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.194537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.262 #22 NEW cov: 11805 ft: 14672 corp: 15/690b lim: 100 exec/s: 0 rss: 68Mb L: 73/84 MS: 1 ShuffleBytes- 00:08:02.262 [2024-07-15 00:16:01.234165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.234192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:02.262 #23 NEW cov: 11828 ft: 14727 corp: 16/729b lim: 100 exec/s: 0 rss: 69Mb L: 39/84 MS: 1 ChangeBinInt- 00:08:02.262 [2024-07-15 00:16:01.274332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.274362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 #24 NEW cov: 11828 ft: 14773 corp: 17/765b lim: 100 exec/s: 0 rss: 69Mb L: 36/84 MS: 1 EraseBytes- 00:08:02.262 [2024-07-15 00:16:01.314592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.314621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.262 [2024-07-15 00:16:01.314669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743322090274815 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.262 [2024-07-15 00:16:01.314684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.521 #25 NEW cov: 11828 ft: 14829 corp: 18/822b lim: 100 exec/s: 25 rss: 69Mb L: 57/84 MS: 1 InsertRepeatedBytes- 00:08:02.521 [2024-07-15 00:16:01.354667] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.354695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.521 [2024-07-15 00:16:01.354742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.354759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.521 #26 NEW cov: 11828 ft: 14840 corp: 19/872b lim: 100 exec/s: 26 rss: 69Mb L: 50/84 MS: 1 ShuffleBytes- 00:08:02.521 [2024-07-15 00:16:01.394603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.394630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.521 #27 NEW cov: 11828 ft: 14879 corp: 20/897b lim: 100 exec/s: 27 rss: 69Mb L: 25/84 MS: 1 ChangeBit- 00:08:02.521 [2024-07-15 00:16:01.434935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.434963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.521 [2024-07-15 00:16:01.435006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.435023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.521 #28 NEW cov: 11828 ft: 14905 corp: 21/947b lim: 100 exec/s: 28 rss: 69Mb L: 50/84 MS: 1 ChangeBit- 00:08:02.521 [2024-07-15 00:16:01.475203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.475230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.521 [2024-07-15 00:16:01.475268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.475283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.521 [2024-07-15 00:16:01.475341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.475375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.521 #29 NEW cov: 11828 ft: 14965 corp: 22/1020b lim: 100 exec/s: 29 rss: 69Mb L: 73/84 MS: 1 CrossOver- 00:08:02.521 [2024-07-15 00:16:01.525046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.525074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.521 #30 NEW cov: 11828 ft: 14977 corp: 23/1059b lim: 100 exec/s: 30 rss: 69Mb L: 39/84 MS: 1 ShuffleBytes- 00:08:02.521 [2024-07-15 00:16:01.555131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.521 [2024-07-15 00:16:01.555160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 #36 NEW cov: 11828 ft: 14983 corp: 24/1098b lim: 100 exec/s: 36 rss: 69Mb L: 39/84 MS: 1 ChangeBinInt- 00:08:02.780 [2024-07-15 00:16:01.595410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:2816 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.595438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.595486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.595502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.780 #37 NEW cov: 11828 ft: 14997 corp: 25/1154b lim: 100 exec/s: 37 rss: 69Mb L: 56/84 MS: 1 CrossOver- 00:08:02.780 [2024-07-15 00:16:01.635740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.635768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.635804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.635820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.635876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.635892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.780 #38 NEW cov: 11828 ft: 15087 corp: 26/1227b lim: 100 exec/s: 38 rss: 70Mb L: 73/84 MS: 1 CopyPart- 00:08:02.780 [2024-07-15 00:16:01.685901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.685931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.685971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 
[2024-07-15 00:16:01.685986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.686046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.686062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.780 #39 NEW cov: 11828 ft: 15097 corp: 27/1301b lim: 100 exec/s: 39 rss: 70Mb L: 74/84 MS: 1 InsertByte- 00:08:02.780 [2024-07-15 00:16:01.726078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.726107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.726141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.726156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.726210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.726241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.726299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744072345288703 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.726314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.780 #40 NEW cov: 11828 ft: 15108 corp: 28/1386b lim: 100 exec/s: 40 rss: 70Mb L: 85/85 MS: 1 InsertByte- 00:08:02.780 [2024-07-15 00:16:01.775799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.775829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 #41 NEW cov: 11828 ft: 15121 corp: 29/1422b lim: 100 exec/s: 41 rss: 70Mb L: 36/85 MS: 1 ChangeBinInt- 00:08:02.780 [2024-07-15 00:16:01.816052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.816080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.780 [2024-07-15 00:16:01.816133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.780 [2024-07-15 00:16:01.816149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.780 #42 NEW cov: 11828 ft: 15190 
corp: 30/1472b lim: 100 exec/s: 42 rss: 70Mb L: 50/85 MS: 1 ShuffleBytes- 00:08:03.039 [2024-07-15 00:16:01.856138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.856166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:01.856216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.856231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.040 #43 NEW cov: 11828 ft: 15207 corp: 31/1512b lim: 100 exec/s: 43 rss: 70Mb L: 40/85 MS: 1 ShuffleBytes- 00:08:03.040 [2024-07-15 00:16:01.896414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.896446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:01.896496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.896511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:01.896568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.896583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.040 #44 NEW cov: 11828 ft: 15220 corp: 32/1586b lim: 100 exec/s: 44 rss: 70Mb L: 74/85 MS: 1 InsertByte- 00:08:03.040 [2024-07-15 00:16:01.936680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.936708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:01.936748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073704222382 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.936763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:01.936819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18423855192261787647 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.936834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:01.936895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.936911] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.040 #45 NEW cov: 11828 ft: 15305 corp: 33/1680b lim: 100 exec/s: 45 rss: 70Mb L: 94/94 MS: 1 InsertRepeatedBytes- 00:08:03.040 [2024-07-15 00:16:01.976307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069951455231 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:01.976334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.040 #46 NEW cov: 11828 ft: 15316 corp: 34/1711b lim: 100 exec/s: 46 rss: 70Mb L: 31/94 MS: 1 ChangeBinInt- 00:08:03.040 [2024-07-15 00:16:02.016728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:02.016755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:02.016794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:02.016810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:02.016867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:02.016883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.040 #47 NEW cov: 11828 ft: 15373 corp: 35/1785b lim: 100 exec/s: 47 rss: 70Mb L: 74/94 MS: 1 ShuffleBytes- 00:08:03.040 [2024-07-15 00:16:02.056854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:02.056880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:02.056920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:02.056936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.040 [2024-07-15 00:16:02.056994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.040 [2024-07-15 00:16:02.057009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.040 #48 NEW cov: 11828 ft: 15428 corp: 36/1859b lim: 100 exec/s: 48 rss: 70Mb L: 74/94 MS: 1 InsertByte- 00:08:03.300 [2024-07-15 00:16:02.096947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.096975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.300 [2024-07-15 00:16:02.097013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184295598 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.097029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.300 [2024-07-15 00:16:02.097087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587279482159541849 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.097106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.300 #49 NEW cov: 11828 ft: 15436 corp: 37/1923b lim: 100 exec/s: 49 rss: 70Mb L: 64/94 MS: 1 EraseBytes- 00:08:03.300 [2024-07-15 00:16:02.136931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.136960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.300 [2024-07-15 00:16:02.137006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:72057598332895231 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.137022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.300 #50 NEW cov: 11828 ft: 15461 corp: 38/1971b lim: 100 exec/s: 50 rss: 70Mb L: 48/94 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:03.300 [2024-07-15 00:16:02.177020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16777216 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.177047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.300 [2024-07-15 00:16:02.177103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.177119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.300 #51 NEW cov: 11828 ft: 15526 corp: 39/2021b lim: 100 exec/s: 51 rss: 70Mb L: 50/94 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:03.300 [2024-07-15 00:16:02.217280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.217306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.300 [2024-07-15 00:16:02.217359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12587190075184251054 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.217376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.300 [2024-07-15 00:16:02.217434] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.300 [2024-07-15 00:16:02.217454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.300 #52 NEW cov: 11828 ft: 15532 corp: 40/2096b lim: 100 exec/s: 52 rss: 70Mb L: 75/94 MS: 1 InsertByte- 00:08:03.301 [2024-07-15 00:16:02.257626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.301 [2024-07-15 00:16:02.257653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.301 [2024-07-15 00:16:02.257689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18423855192261787647 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.301 [2024-07-15 00:16:02.257706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.301 [2024-07-15 00:16:02.257764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.301 [2024-07-15 00:16:02.257797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.301 [2024-07-15 00:16:02.257859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12587190073825341102 len:44719 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.301 [2024-07-15 00:16:02.257874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.301 #53 NEW cov: 11828 ft: 15540 corp: 41/2194b lim: 100 exec/s: 53 rss: 70Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:03.301 [2024-07-15 00:16:02.307490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069584781311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.301 [2024-07-15 00:16:02.307518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.301 [2024-07-15 00:16:02.307589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:72057598332895231 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.301 [2024-07-15 00:16:02.307604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.301 #59 NEW cov: 11828 ft: 15623 corp: 42/2250b lim: 100 exec/s: 29 rss: 70Mb L: 56/98 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:03.301 #59 DONE cov: 11828 ft: 15623 corp: 42/2250b lim: 100 exec/s: 29 rss: 70Mb 00:08:03.301 ###### Recommended dictionary. ###### 00:08:03.301 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:03.301 ###### End of recommended dictionary. 
###### 00:08:03.301 Done 59 runs in 2 second(s) 00:08:03.560 00:16:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:03.560 00:16:02 -- ../common.sh@72 -- # (( i++ )) 00:08:03.560 00:16:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.560 00:16:02 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:03.560 00:08:03.560 real 1m3.946s 00:08:03.560 user 1m39.976s 00:08:03.560 sys 0m7.293s 00:08:03.560 00:16:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.560 00:16:02 -- common/autotest_common.sh@10 -- # set +x 00:08:03.560 ************************************ 00:08:03.560 END TEST nvmf_fuzz 00:08:03.560 ************************************ 00:08:03.560 00:16:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:03.560 00:16:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:03.560 00:16:02 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:03.560 00:16:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:03.560 00:16:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:03.560 00:16:02 -- common/autotest_common.sh@10 -- # set +x 00:08:03.560 ************************************ 00:08:03.560 START TEST vfio_fuzz 00:08:03.560 ************************************ 00:08:03.560 00:16:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:03.560 * Looking for test storage... 00:08:03.560 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:03.560 00:16:02 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:03.560 00:16:02 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:03.560 00:16:02 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:03.560 00:16:02 -- common/autotest_common.sh@34 -- # set -e 00:08:03.560 00:16:02 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:03.560 00:16:02 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:03.560 00:16:02 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:03.560 00:16:02 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:03.560 00:16:02 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:03.560 00:16:02 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:03.560 00:16:02 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:03.560 00:16:02 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:03.560 00:16:02 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:03.560 00:16:02 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:03.560 00:16:02 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:03.560 00:16:02 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:03.560 00:16:02 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:03.560 00:16:02 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:03.560 00:16:02 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:03.560 00:16:02 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:03.560 00:16:02 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:03.822 00:16:02 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:03.822 00:16:02 -- common/build_config.sh@15 -- # 
CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:03.822 00:16:02 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:03.822 00:16:02 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:03.822 00:16:02 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:03.822 00:16:02 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:03.822 00:16:02 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:03.822 00:16:02 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:03.822 00:16:02 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:03.822 00:16:02 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:03.822 00:16:02 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:03.822 00:16:02 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:03.822 00:16:02 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:03.822 00:16:02 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:03.822 00:16:02 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:03.822 00:16:02 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:03.822 00:16:02 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:03.822 00:16:02 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:03.822 00:16:02 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:03.822 00:16:02 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:03.822 00:16:02 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:03.822 00:16:02 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:03.822 00:16:02 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:03.822 00:16:02 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:03.822 00:16:02 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:03.822 00:16:02 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:03.822 00:16:02 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:03.822 00:16:02 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:03.822 00:16:02 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:03.822 00:16:02 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:03.822 00:16:02 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:03.822 00:16:02 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:03.822 00:16:02 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:03.822 00:16:02 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:03.822 00:16:02 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:03.822 00:16:02 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:03.822 00:16:02 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:03.822 00:16:02 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:03.822 00:16:02 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:03.822 00:16:02 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:03.822 00:16:02 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:03.822 00:16:02 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:03.822 00:16:02 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:03.822 00:16:02 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:03.822 00:16:02 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:03.822 00:16:02 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:03.822 00:16:02 -- 
common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:03.822 00:16:02 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:08:03.822 00:16:02 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:03.822 00:16:02 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:03.822 00:16:02 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:03.822 00:16:02 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:03.822 00:16:02 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:03.822 00:16:02 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:03.822 00:16:02 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:03.822 00:16:02 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:03.822 00:16:02 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:03.822 00:16:02 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:03.822 00:16:02 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:03.822 00:16:02 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:03.822 00:16:02 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:03.822 00:16:02 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:03.822 00:16:02 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:03.822 00:16:02 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:03.822 00:16:02 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:03.822 00:16:02 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:03.822 00:16:02 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:03.822 00:16:02 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:03.822 00:16:02 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:03.822 00:16:02 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:03.822 00:16:02 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:03.822 00:16:02 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:03.822 00:16:02 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:03.822 00:16:02 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:03.822 00:16:02 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:03.822 00:16:02 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:03.822 00:16:02 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:03.822 00:16:02 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:03.822 00:16:02 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:03.822 00:16:02 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:03.822 00:16:02 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:03.822 00:16:02 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:03.822 #define SPDK_CONFIG_H 00:08:03.822 #define SPDK_CONFIG_APPS 1 00:08:03.822 #define SPDK_CONFIG_ARCH native 00:08:03.822 #undef SPDK_CONFIG_ASAN 00:08:03.823 #undef SPDK_CONFIG_AVAHI 00:08:03.823 #undef SPDK_CONFIG_CET 00:08:03.823 #define 
SPDK_CONFIG_COVERAGE 1 00:08:03.823 #define SPDK_CONFIG_CROSS_PREFIX 00:08:03.823 #undef SPDK_CONFIG_CRYPTO 00:08:03.823 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:03.823 #undef SPDK_CONFIG_CUSTOMOCF 00:08:03.823 #undef SPDK_CONFIG_DAOS 00:08:03.823 #define SPDK_CONFIG_DAOS_DIR 00:08:03.823 #define SPDK_CONFIG_DEBUG 1 00:08:03.823 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:03.823 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:03.823 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:03.823 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:03.823 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:03.823 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:03.823 #define SPDK_CONFIG_EXAMPLES 1 00:08:03.823 #undef SPDK_CONFIG_FC 00:08:03.823 #define SPDK_CONFIG_FC_PATH 00:08:03.823 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:03.823 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:03.823 #undef SPDK_CONFIG_FUSE 00:08:03.823 #define SPDK_CONFIG_FUZZER 1 00:08:03.823 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:03.823 #undef SPDK_CONFIG_GOLANG 00:08:03.823 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:03.823 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:03.823 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:03.823 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:03.823 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:03.823 #define SPDK_CONFIG_IDXD 1 00:08:03.823 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:03.823 #undef SPDK_CONFIG_IPSEC_MB 00:08:03.823 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:03.823 #define SPDK_CONFIG_ISAL 1 00:08:03.823 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:03.823 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:03.823 #define SPDK_CONFIG_LIBDIR 00:08:03.823 #undef SPDK_CONFIG_LTO 00:08:03.823 #define SPDK_CONFIG_MAX_LCORES 00:08:03.823 #define SPDK_CONFIG_NVME_CUSE 1 00:08:03.823 #undef SPDK_CONFIG_OCF 00:08:03.823 #define SPDK_CONFIG_OCF_PATH 00:08:03.823 #define SPDK_CONFIG_OPENSSL_PATH 00:08:03.823 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:03.823 #undef SPDK_CONFIG_PGO_USE 00:08:03.823 #define SPDK_CONFIG_PREFIX /usr/local 00:08:03.823 #undef SPDK_CONFIG_RAID5F 00:08:03.823 #undef SPDK_CONFIG_RBD 00:08:03.823 #define SPDK_CONFIG_RDMA 1 00:08:03.823 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:03.823 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:03.823 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:03.823 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:03.823 #undef SPDK_CONFIG_SHARED 00:08:03.823 #undef SPDK_CONFIG_SMA 00:08:03.823 #define SPDK_CONFIG_TESTS 1 00:08:03.823 #undef SPDK_CONFIG_TSAN 00:08:03.823 #define SPDK_CONFIG_UBLK 1 00:08:03.823 #define SPDK_CONFIG_UBSAN 1 00:08:03.823 #undef SPDK_CONFIG_UNIT_TESTS 00:08:03.823 #undef SPDK_CONFIG_URING 00:08:03.823 #define SPDK_CONFIG_URING_PATH 00:08:03.823 #undef SPDK_CONFIG_URING_ZNS 00:08:03.823 #undef SPDK_CONFIG_USDT 00:08:03.823 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:03.823 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:03.823 #define SPDK_CONFIG_VFIO_USER 1 00:08:03.823 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:03.823 #define SPDK_CONFIG_VHOST 1 00:08:03.823 #define SPDK_CONFIG_VIRTIO 1 00:08:03.823 #undef SPDK_CONFIG_VTUNE 00:08:03.823 #define SPDK_CONFIG_VTUNE_DIR 00:08:03.823 #define SPDK_CONFIG_WERROR 1 00:08:03.823 #define SPDK_CONFIG_WPDK_DIR 00:08:03.823 #undef SPDK_CONFIG_XNVME 00:08:03.823 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:03.823 00:16:02 -- common/applications.sh@24 -- # (( 
SPDK_AUTOTEST_DEBUG_APPS )) 00:08:03.823 00:16:02 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:03.823 00:16:02 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:03.823 00:16:02 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:03.823 00:16:02 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:03.823 00:16:02 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.823 00:16:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.823 00:16:02 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.823 00:16:02 -- paths/export.sh@5 -- # export PATH 00:08:03.823 00:16:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.823 00:16:02 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:03.823 00:16:02 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:03.823 00:16:02 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:03.823 00:16:02 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:03.823 00:16:02 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:03.823 00:16:02 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:03.823 00:16:02 -- pm/common@16 -- # TEST_TAG=N/A 00:08:03.823 00:16:02 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:03.823 00:16:02 -- common/autotest_common.sh@52 -- # : 1 00:08:03.823 00:16:02 -- 
common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:03.823 00:16:02 -- common/autotest_common.sh@56 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:03.823 00:16:02 -- common/autotest_common.sh@58 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:03.823 00:16:02 -- common/autotest_common.sh@60 -- # : 1 00:08:03.823 00:16:02 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:03.823 00:16:02 -- common/autotest_common.sh@62 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:03.823 00:16:02 -- common/autotest_common.sh@64 -- # : 00:08:03.823 00:16:02 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:03.823 00:16:02 -- common/autotest_common.sh@66 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:03.823 00:16:02 -- common/autotest_common.sh@68 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:03.823 00:16:02 -- common/autotest_common.sh@70 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:03.823 00:16:02 -- common/autotest_common.sh@72 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:03.823 00:16:02 -- common/autotest_common.sh@74 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:03.823 00:16:02 -- common/autotest_common.sh@76 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:03.823 00:16:02 -- common/autotest_common.sh@78 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:03.823 00:16:02 -- common/autotest_common.sh@80 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:03.823 00:16:02 -- common/autotest_common.sh@82 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:03.823 00:16:02 -- common/autotest_common.sh@84 -- # : 0 00:08:03.823 00:16:02 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:03.823 00:16:02 -- common/autotest_common.sh@86 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:03.824 00:16:02 -- common/autotest_common.sh@88 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:03.824 00:16:02 -- common/autotest_common.sh@90 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:03.824 00:16:02 -- common/autotest_common.sh@92 -- # : 1 00:08:03.824 00:16:02 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:03.824 00:16:02 -- common/autotest_common.sh@94 -- # : 1 00:08:03.824 00:16:02 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:03.824 00:16:02 -- common/autotest_common.sh@96 -- # : rdma 00:08:03.824 00:16:02 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:03.824 00:16:02 -- common/autotest_common.sh@98 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:03.824 00:16:02 -- common/autotest_common.sh@100 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:03.824 00:16:02 -- common/autotest_common.sh@102 -- # : 0 00:08:03.824 
00:16:02 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:03.824 00:16:02 -- common/autotest_common.sh@104 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:03.824 00:16:02 -- common/autotest_common.sh@106 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:03.824 00:16:02 -- common/autotest_common.sh@108 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:03.824 00:16:02 -- common/autotest_common.sh@110 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:03.824 00:16:02 -- common/autotest_common.sh@112 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:03.824 00:16:02 -- common/autotest_common.sh@114 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:03.824 00:16:02 -- common/autotest_common.sh@116 -- # : 1 00:08:03.824 00:16:02 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:03.824 00:16:02 -- common/autotest_common.sh@118 -- # : 00:08:03.824 00:16:02 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:03.824 00:16:02 -- common/autotest_common.sh@120 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:03.824 00:16:02 -- common/autotest_common.sh@122 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:03.824 00:16:02 -- common/autotest_common.sh@124 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:03.824 00:16:02 -- common/autotest_common.sh@126 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:03.824 00:16:02 -- common/autotest_common.sh@128 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:03.824 00:16:02 -- common/autotest_common.sh@130 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:03.824 00:16:02 -- common/autotest_common.sh@132 -- # : 00:08:03.824 00:16:02 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:03.824 00:16:02 -- common/autotest_common.sh@134 -- # : true 00:08:03.824 00:16:02 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:03.824 00:16:02 -- common/autotest_common.sh@136 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:03.824 00:16:02 -- common/autotest_common.sh@138 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:03.824 00:16:02 -- common/autotest_common.sh@140 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:03.824 00:16:02 -- common/autotest_common.sh@142 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:03.824 00:16:02 -- common/autotest_common.sh@144 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:03.824 00:16:02 -- common/autotest_common.sh@146 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:03.824 00:16:02 -- common/autotest_common.sh@148 -- # : 00:08:03.824 00:16:02 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:03.824 00:16:02 -- common/autotest_common.sh@150 -- # : 0 
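The ': <value>' / 'export VAR' pairs traced through this stretch are bash's default-assignment idiom: the no-op ':' builtin evaluates a ${VAR=default} expansion purely for its side effect, so the xtrace prints only the resulting value. A minimal sketch of the pattern, using the variable the trace exports next (whether the script spells it '=' or ':=', assign-if-unset versus assign-if-unset-or-empty, is not visible in the trace):

    # Give SPDK_TEST_SMA a default of 0 if the caller did not set it; the
    # ':' builtin discards its arguments, so only the assignment side
    # effect survives, and xtrace renders the line as ': 0'.
    : "${SPDK_TEST_SMA:=0}"
    export SPDK_TEST_SMA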
00:08:03.824 00:16:02 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:03.824 00:16:02 -- common/autotest_common.sh@152 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:03.824 00:16:02 -- common/autotest_common.sh@154 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:03.824 00:16:02 -- common/autotest_common.sh@156 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:03.824 00:16:02 -- common/autotest_common.sh@158 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:03.824 00:16:02 -- common/autotest_common.sh@160 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:03.824 00:16:02 -- common/autotest_common.sh@163 -- # : 00:08:03.824 00:16:02 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:03.824 00:16:02 -- common/autotest_common.sh@165 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:03.824 00:16:02 -- common/autotest_common.sh@167 -- # : 0 00:08:03.824 00:16:02 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:03.824 00:16:02 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:03.824 00:16:02 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:03.824 00:16:02 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:03.824 00:16:02 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:03.824 00:16:02 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:03.824 00:16:02 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:03.824 00:16:02 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:03.824 00:16:02 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:03.824 00:16:02 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:03.824 00:16:02 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:03.824 00:16:02 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:03.824 00:16:02 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:03.824 00:16:02 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:03.824 00:16:02 -- common/autotest_common.sh@196 -- # cat 00:08:03.824 00:16:02 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:03.824 00:16:02 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:03.825 00:16:02 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:03.825 00:16:02 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:03.825 00:16:02 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:03.825 00:16:02 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:03.825 00:16:02 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:03.825 00:16:02 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:03.825 00:16:02 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:03.825 00:16:02 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:03.825 00:16:02 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:03.825 00:16:02 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:03.825 00:16:02 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:03.825 00:16:02 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:03.825 00:16:02 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:03.825 00:16:02 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:03.825 00:16:02 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:03.825 00:16:02 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:03.825 00:16:02 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:03.825 00:16:02 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:03.825 00:16:02 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:03.825 00:16:02 -- common/autotest_common.sh@249 -- # valgrind= 00:08:03.825 00:16:02 -- common/autotest_common.sh@255 -- # uname -s 00:08:03.825 00:16:02 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:03.825 00:16:02 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:03.825 00:16:02 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:03.825 00:16:02 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:03.825 00:16:02 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:03.825 00:16:02 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:03.825 00:16:02 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:03.825 00:16:02 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:03.825 00:16:02 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:03.825 00:16:02 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:03.825 00:16:02 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:03.825 00:16:02 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:03.825 00:16:02 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:03.825 00:16:02 -- common/autotest_common.sh@309 -- # [[ -z 337496 ]] 00:08:03.825 00:16:02 -- common/autotest_common.sh@309 -- # kill -0 337496 00:08:03.825 00:16:02 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:03.825 00:16:02 -- common/autotest_common.sh@319 -- # [[ -v 
testdir ]] 00:08:03.825 00:16:02 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:03.825 00:16:02 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:03.825 00:16:02 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:03.825 00:16:02 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:03.825 00:16:02 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:03.825 00:16:02 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:03.825 00:16:02 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.Ig0rYQ 00:08:03.825 00:16:02 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:03.825 00:16:02 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:03.825 00:16:02 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:08:03.825 00:16:02 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.Ig0rYQ/tests/vfio /tmp/spdk.Ig0rYQ 00:08:03.825 00:16:02 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@318 -- # df -T 00:08:03.825 00:16:02 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=54350225408 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=7392092160 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r 
source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870425600 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=733184 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:03.825 00:16:02 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:03.825 00:16:02 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:03.825 00:16:02 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:03.825 00:16:02 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:03.825 * Looking for test storage... 00:08:03.825 00:16:02 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:03.825 00:16:02 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:03.825 00:16:02 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:03.825 00:16:02 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:03.825 00:16:02 -- common/autotest_common.sh@363 -- # mount=/ 00:08:03.825 00:16:02 -- common/autotest_common.sh@365 -- # target_space=54350225408 00:08:03.826 00:16:02 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:03.826 00:16:02 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:03.826 00:16:02 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:03.826 00:16:02 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:03.826 00:16:02 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:03.826 00:16:02 -- common/autotest_common.sh@372 -- # new_size=9606684672 00:08:03.826 00:16:02 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:03.826 00:16:02 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:03.826 00:16:02 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:03.826 00:16:02 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:03.826 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:03.826 
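The storage probe above (set_test_storage) parses 'df -T' into per-mount arrays, then walks candidate directories until one sits on a filesystem with enough room: here / has 54350225408 bytes available against a 2214592512-byte request, and the projected-usage check (current use plus the request, 9606684672 bytes, kept under 95% of the 61742317568-byte filesystem) also passes. A condensed sketch of that walk, reconstructed from the xtrace; the tmpfs-resize branch is omitted and the array plumbing is simplified:

    # Index `df -T` output by mount point (field order per the read above).
    declare -A avails sizes uses fss
    while read -r source fs size use avail _ mount; do
        fss["$mount"]=$fs; sizes["$mount"]=$size
        uses["$mount"]=$use; avails["$mount"]=$avail
    done < <(df -T | grep -v Filesystem)

    # storage_candidates and requested_size are set by the caller; pick
    # the first candidate whose filesystem can absorb the request without
    # pushing projected usage past 95%.
    for target_dir in "${storage_candidates[@]}"; do
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]}
        (( target_space == 0 || target_space < requested_size )) && continue
        new_size=$(( ${uses[$mount]} + requested_size ))
        (( new_size * 100 / ${sizes[$mount]} > 95 )) && continue
        export SPDK_TEST_STORAGE=$target_dir
        break
    done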
00:16:02 -- common/autotest_common.sh@380 -- # return 0 00:08:03.826 00:16:02 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:03.826 00:16:02 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:03.826 00:16:02 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:03.826 00:16:02 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:03.826 00:16:02 -- common/autotest_common.sh@1672 -- # true 00:08:03.826 00:16:02 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:03.826 00:16:02 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:03.826 00:16:02 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:03.826 00:16:02 -- common/autotest_common.sh@27 -- # exec 00:08:03.826 00:16:02 -- common/autotest_common.sh@29 -- # exec 00:08:03.826 00:16:02 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:03.826 00:16:02 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:03.826 00:16:02 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:03.826 00:16:02 -- common/autotest_common.sh@18 -- # set -x 00:08:03.826 00:16:02 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:03.826 00:16:02 -- ../common.sh@8 -- # pids=() 00:08:03.826 00:16:02 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:03.826 00:16:02 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:03.826 00:16:02 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:03.826 00:16:02 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:03.826 00:16:02 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:03.826 00:16:02 -- vfio/run.sh@65 -- # mem_size=0 00:08:03.826 00:16:02 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:03.826 00:16:02 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:03.826 00:16:02 -- ../common.sh@69 -- # local fuzz_num=7 00:08:03.826 00:16:02 -- ../common.sh@70 -- # local time=1 00:08:03.826 00:16:02 -- ../common.sh@72 -- # (( i = 0 )) 00:08:03.826 00:16:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.826 00:16:02 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:03.826 00:16:02 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:03.826 00:16:02 -- vfio/run.sh@23 -- # local timen=1 00:08:03.826 00:16:02 -- vfio/run.sh@24 -- # local core=0x1 00:08:03.826 00:16:02 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:03.826 00:16:02 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:03.826 00:16:02 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:03.826 00:16:02 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:03.826 00:16:02 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:03.826 00:16:02 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:03.826 00:16:02 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:03.826 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 
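The trace above also shows the shape of the short-fuzz driver: common.sh loops fuzzer indices 0 through fuzz_num-1 (fuzz_num=7, counted from the '.fn =' entries in llvm_vfio_fuzz.c), and vfio/run.sh stages a private /tmp/vfio-user-N tree per index, retargeting the shared fuzz_vfio_json.conf template with sed. A reconstruction from the xtrace; function bodies are condensed, $spdk stands in for the workspace checkout path, and the sed output redirection is assumed since the trace does not show it:

    # Driver loop per the ../common.sh trace: run each registered fuzzer
    # for the same time budget on core mask 0x1.
    start_llvm_fuzz_short() {
        local fuzz_num=$1 time=$2
        for (( i = 0; i < fuzz_num; i++ )); do
            start_llvm_fuzz "$i" "$time" 0x1
        done
    }

    # Per-fuzzer staging from vfio/run.sh (index 0 shown): point the JSON
    # config at this fuzzer's own vfio-user socket directories.
    mkdir -p /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2
    sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%;
            s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' \
        "$spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" \
        > /tmp/vfio-user-0/fuzz_vfio_json.conf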
00:08:03.826 00:16:02 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:03.826 [2024-07-15 00:16:02.817906] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:03.826 [2024-07-15 00:16:02.817988] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid337543 ] 00:08:03.826 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.085 [2024-07-15 00:16:02.894335] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.085 [2024-07-15 00:16:02.965072] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:04.085 [2024-07-15 00:16:02.965238] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.343 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.343 INFO: Seed: 546822894 00:08:04.343 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:04.343 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:04.343 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:04.343 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.343 #2 INITED exec/s: 0 rss: 61Mb 00:08:04.343 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:04.343 This may also happen if the target rejected all inputs we tried so far 00:08:04.600 NEW_FUNC[1/631]: 0x4806f0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:04.600 NEW_FUNC[2/631]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:04.600 #6 NEW cov: 10716 ft: 10676 corp: 2/10b lim: 60 exec/s: 0 rss: 66Mb L: 9/9 MS: 4 CopyPart-ShuffleBytes-EraseBytes-CMP- DE: "S\035?>\324\233*\000"- 00:08:04.858 #7 NEW cov: 10730 ft: 13591 corp: 3/19b lim: 60 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 PersAutoDict- DE: "S\035?>\324\233*\000"- 00:08:05.117 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.117 #8 NEW cov: 10748 ft: 15449 corp: 4/29b lim: 60 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:08:05.376 #9 NEW cov: 10748 ft: 16098 corp: 5/38b lim: 60 exec/s: 9 rss: 68Mb L: 9/10 MS: 1 PersAutoDict- DE: "S\035?>\324\233*\000"- 00:08:05.376 #10 NEW cov: 10748 ft: 16768 corp: 6/48b lim: 60 exec/s: 10 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:08:05.634 #11 NEW cov: 10748 ft: 17221 corp: 7/58b lim: 60 exec/s: 11 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:08:05.893 #12 NEW cov: 10748 ft: 17494 corp: 8/68b lim: 60 exec/s: 12 rss: 68Mb L: 10/10 MS: 1 ChangeASCIIInt- 00:08:06.151 #13 NEW cov: 10748 ft: 17749 corp: 9/78b lim: 60 exec/s: 13 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:08:06.151 #14 NEW cov: 10755 ft: 18074 corp: 10/87b lim: 60 exec/s: 14 rss: 68Mb L: 9/10 MS: 1 ChangeByte- 00:08:06.411 #15 NEW cov: 10755 ft: 18284 corp: 11/140b lim: 60 exec/s: 7 rss: 68Mb L: 53/53 MS: 1 InsertRepeatedBytes- 00:08:06.411 #15 DONE cov: 10755 ft: 18284 corp: 11/140b lim: 60 exec/s: 7 rss: 68Mb 00:08:06.411 ###### Recommended dictionary. ###### 00:08:06.411 "S\035?>\324\233*\000" # Uses: 2 00:08:06.411 ###### End of recommended dictionary. 
###### 00:08:06.411 Done 15 runs in 2 second(s) 00:08:06.670 00:16:05 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:06.670 00:16:05 -- ../common.sh@72 -- # (( i++ )) 00:08:06.670 00:16:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.670 00:16:05 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:06.670 00:16:05 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:06.670 00:16:05 -- vfio/run.sh@23 -- # local timen=1 00:08:06.670 00:16:05 -- vfio/run.sh@24 -- # local core=0x1 00:08:06.670 00:16:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:06.670 00:16:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:06.670 00:16:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:06.670 00:16:05 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:06.670 00:16:05 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:06.670 00:16:05 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:06.670 00:16:05 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:06.670 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:06.670 00:16:05 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:06.670 [2024-07-15 00:16:05.632619] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:06.670 [2024-07-15 00:16:05.632689] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid338082 ] 00:08:06.670 EAL: No free 2048 kB hugepages reported on node 1 00:08:06.670 [2024-07-15 00:16:05.704438] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.929 [2024-07-15 00:16:05.773487] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:06.929 [2024-07-15 00:16:05.773650] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.929 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.929 INFO: Seed: 3348816376 00:08:06.929 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:06.929 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:06.929 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:06.929 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.929 #2 INITED exec/s: 0 rss: 61Mb 00:08:06.929 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:06.930 This may also happen if the target rejected all inputs we tried so far 00:08:07.189 [2024-07-15 00:16:06.099225] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:07.189 [2024-07-15 00:16:06.099255] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:07.189 [2024-07-15 00:16:06.099275] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:07.448 NEW_FUNC[1/638]: 0x480c90 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:07.448 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:07.448 #11 NEW cov: 10728 ft: 10482 corp: 2/7b lim: 40 exec/s: 0 rss: 66Mb L: 6/6 MS: 4 InsertByte-CopyPart-InsertByte-CopyPart- 00:08:07.708 [2024-07-15 00:16:06.591085] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:07.708 [2024-07-15 00:16:06.591122] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:07.708 [2024-07-15 00:16:06.591140] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:07.708 #12 NEW cov: 10742 ft: 13427 corp: 3/14b lim: 40 exec/s: 0 rss: 68Mb L: 7/7 MS: 1 CrossOver- 00:08:07.967 [2024-07-15 00:16:06.776222] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:07.967 [2024-07-15 00:16:06.776244] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:07.967 [2024-07-15 00:16:06.776261] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:07.967 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:07.967 #13 NEW cov: 10759 ft: 14797 corp: 4/21b lim: 40 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 InsertByte- 00:08:07.967 [2024-07-15 00:16:06.958793] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:07.967 [2024-07-15 00:16:06.958815] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:07.967 [2024-07-15 00:16:06.958832] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:08.227 #14 NEW cov: 10762 ft: 15907 corp: 5/28b lim: 40 exec/s: 14 rss: 69Mb L: 7/7 MS: 1 ChangeBinInt- 00:08:08.227 [2024-07-15 00:16:07.144642] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:08.227 [2024-07-15 00:16:07.144663] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:08.227 [2024-07-15 00:16:07.144680] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:08.227 #15 NEW cov: 10762 ft: 16145 corp: 6/43b lim: 40 exec/s: 15 rss: 69Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:08:08.486 [2024-07-15 00:16:07.334467] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:08.486 [2024-07-15 00:16:07.334489] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:08.486 [2024-07-15 00:16:07.334508] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:08.486 #16 NEW cov: 10762 ft: 16417 corp: 7/50b lim: 40 exec/s: 16 rss: 69Mb L: 7/15 MS: 1 ChangeBinInt- 00:08:08.486 [2024-07-15 00:16:07.521272] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:08.486 [2024-07-15 00:16:07.521294] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:08.486 [2024-07-15 00:16:07.521310] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:08.745 #17 NEW cov: 10762 ft: 16595 corp: 8/56b lim: 40 exec/s: 17 rss: 69Mb L: 6/15 MS: 1 ShuffleBytes- 00:08:08.745 [2024-07-15 00:16:07.711337] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:08.745 [2024-07-15 00:16:07.711358] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:08.745 [2024-07-15 00:16:07.711375] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:09.008 #18 NEW cov: 10769 ft: 17639 corp: 9/67b lim: 40 exec/s: 18 rss: 69Mb L: 11/15 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:09.008 [2024-07-15 00:16:07.900120] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:09.008 [2024-07-15 00:16:07.900142] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:09.008 [2024-07-15 00:16:07.900159] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:09.008 #19 NEW cov: 10769 ft: 17724 corp: 10/82b lim: 40 exec/s: 9 rss: 69Mb L: 15/15 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:09.008 #19 DONE cov: 10769 ft: 17724 corp: 10/82b lim: 40 exec/s: 9 rss: 69Mb 00:08:09.008 ###### Recommended dictionary. ###### 00:08:09.008 "\377\377\377\377" # Uses: 1 00:08:09.008 ###### End of recommended dictionary. ###### 00:08:09.008 Done 19 runs in 2 second(s) 00:08:09.268 00:16:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:09.268 00:16:08 -- ../common.sh@72 -- # (( i++ )) 00:08:09.268 00:16:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.268 00:16:08 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:09.268 00:16:08 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:09.268 00:16:08 -- vfio/run.sh@23 -- # local timen=1 00:08:09.268 00:16:08 -- vfio/run.sh@24 -- # local core=0x1 00:08:09.268 00:16:08 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:09.268 00:16:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:09.268 00:16:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:09.268 00:16:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:09.268 00:16:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:09.268 00:16:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:09.268 00:16:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:09.268 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:09.268 00:16:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y 
/tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:09.268 [2024-07-15 00:16:08.311593] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:09.268 [2024-07-15 00:16:08.311661] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid338624 ] 00:08:09.526 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.526 [2024-07-15 00:16:08.383597] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.526 [2024-07-15 00:16:08.452249] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:09.526 [2024-07-15 00:16:08.452412] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.784 INFO: Running with entropic power schedule (0xFF, 100). 00:08:09.784 INFO: Seed: 1735850796 00:08:09.784 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:09.784 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:09.784 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:09.784 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.784 #2 INITED exec/s: 0 rss: 61Mb 00:08:09.784 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.784 This may also happen if the target rejected all inputs we tried so far 00:08:09.784 [2024-07-15 00:16:08.715978] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.043 NEW_FUNC[1/636]: 0x481670 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:10.043 NEW_FUNC[2/636]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:10.043 #4 NEW cov: 10708 ft: 10414 corp: 2/29b lim: 80 exec/s: 0 rss: 66Mb L: 28/28 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:10.302 [2024-07-15 00:16:09.125827] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.302 #5 NEW cov: 10722 ft: 13218 corp: 3/58b lim: 80 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 InsertByte- 00:08:10.302 [2024-07-15 00:16:09.239561] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.302 #6 NEW cov: 10725 ft: 14167 corp: 4/87b lim: 80 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CrossOver- 00:08:10.302 [2024-07-15 00:16:09.353544] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.560 #7 NEW cov: 10725 ft: 14320 corp: 5/124b lim: 80 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:10.561 [2024-07-15 00:16:09.467630] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.561 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:10.561 #8 NEW cov: 10742 ft: 14526 corp: 6/153b lim: 80 exec/s: 0 rss: 69Mb L: 29/37 MS: 1 ShuffleBytes- 00:08:10.561 [2024-07-15 00:16:09.581440] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.819 #14 NEW cov: 10742 ft: 14557 corp: 7/182b lim: 80 exec/s: 0 rss: 69Mb L: 29/37 MS: 1 CMP- DE: "\201\000\000\000\000\000\000\000"- 
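The '#NN NEW ... cov: ... ft: ...' records above are libFuzzer status lines: cov counts covered edges, ft coverage features, corp the corpus entries and bytes, exec/s throughput, and MS/DE the mutation sequence and dictionary entries behind each find. To pull the headline numbers out of a full log like this one, a plain grep pass is enough (the filename is illustrative):

    # Final per-fuzzer coverage/feature counts, then total runs executed.
    grep -oE 'DONE cov: [0-9]+ ft: [0-9]+' build.log
    grep -E 'Done [0-9]+ runs in' build.log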
00:08:10.819 [2024-07-15 00:16:09.695312] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.819 #15 NEW cov: 10742 ft: 15501 corp: 8/211b lim: 80 exec/s: 15 rss: 69Mb L: 29/37 MS: 1 ChangeBinInt- 00:08:10.819 [2024-07-15 00:16:09.808017] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:10.819 #16 NEW cov: 10742 ft: 15693 corp: 9/247b lim: 80 exec/s: 16 rss: 69Mb L: 36/37 MS: 1 CopyPart- 00:08:11.078 [2024-07-15 00:16:09.921850] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.078 #17 NEW cov: 10742 ft: 15997 corp: 10/276b lim: 80 exec/s: 17 rss: 69Mb L: 29/37 MS: 1 ShuffleBytes- 00:08:11.078 [2024-07-15 00:16:10.035511] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.078 #18 NEW cov: 10742 ft: 16268 corp: 11/305b lim: 80 exec/s: 18 rss: 69Mb L: 29/37 MS: 1 EraseBytes- 00:08:11.337 [2024-07-15 00:16:10.149860] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.337 #19 NEW cov: 10742 ft: 16393 corp: 12/333b lim: 80 exec/s: 19 rss: 69Mb L: 28/37 MS: 1 ChangeBit- 00:08:11.337 [2024-07-15 00:16:10.262405] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.337 #20 NEW cov: 10742 ft: 16665 corp: 13/363b lim: 80 exec/s: 20 rss: 69Mb L: 30/37 MS: 1 InsertByte- 00:08:11.337 [2024-07-15 00:16:10.376197] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.596 #21 NEW cov: 10742 ft: 16757 corp: 14/429b lim: 80 exec/s: 21 rss: 70Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:08:11.596 [2024-07-15 00:16:10.490063] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.596 #22 NEW cov: 10749 ft: 16917 corp: 15/459b lim: 80 exec/s: 22 rss: 70Mb L: 30/66 MS: 1 InsertByte- 00:08:11.596 [2024-07-15 00:16:10.603103] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:11.855 #23 NEW cov: 10749 ft: 16964 corp: 16/487b lim: 80 exec/s: 11 rss: 70Mb L: 28/66 MS: 1 EraseBytes- 00:08:11.855 #23 DONE cov: 10749 ft: 16964 corp: 16/487b lim: 80 exec/s: 11 rss: 70Mb 00:08:11.855 ###### Recommended dictionary. ###### 00:08:11.855 "\201\000\000\000\000\000\000\000" # Uses: 0 00:08:11.855 ###### End of recommended dictionary. 
###### 00:08:11.855 Done 23 runs in 2 second(s) 00:08:12.113 00:16:10 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:12.114 00:16:10 -- ../common.sh@72 -- # (( i++ )) 00:08:12.114 00:16:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.114 00:16:10 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:12.114 00:16:10 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:12.114 00:16:10 -- vfio/run.sh@23 -- # local timen=1 00:08:12.114 00:16:10 -- vfio/run.sh@24 -- # local core=0x1 00:08:12.114 00:16:10 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:12.114 00:16:10 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:12.114 00:16:10 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:12.114 00:16:10 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:12.114 00:16:10 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:12.114 00:16:10 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:12.114 00:16:10 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:12.114 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:12.114 00:16:10 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:12.114 [2024-07-15 00:16:10.982575] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:12.114 [2024-07-15 00:16:10.982653] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid339109 ] 00:08:12.114 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.114 [2024-07-15 00:16:11.055275] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.114 [2024-07-15 00:16:11.123379] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:12.114 [2024-07-15 00:16:11.123543] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.372 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.372 INFO: Seed: 112878592 00:08:12.372 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:12.372 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:12.372 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:12.372 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.372 #2 INITED exec/s: 0 rss: 61Mb 00:08:12.372 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:12.372 This may also happen if the target rejected all inputs we tried so far 00:08:12.372 [2024-07-15 00:16:11.387480] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:12.372 [2024-07-15 00:16:11.387515] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:12.372 [2024-07-15 00:16:11.387525] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:12.372 [2024-07-15 00:16:11.387558] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:12.889 NEW_FUNC[1/636]: 0x481d50 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:12.889 NEW_FUNC[2/636]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:12.889 #9 NEW cov: 10721 ft: 10242 corp: 2/114b lim: 320 exec/s: 0 rss: 66Mb L: 113/113 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:12.889 [2024-07-15 00:16:11.798385] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:12.889 [2024-07-15 00:16:11.798421] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:12.889 [2024-07-15 00:16:11.798432] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:12.889 [2024-07-15 00:16:11.798456] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:12.889 NEW_FUNC[1/2]: 0x15ebff0 in _is_io_flags_valid /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:141 00:08:12.889 NEW_FUNC[2/2]: 0x1608a00 in _nvme_md_excluded_from_xfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:54 00:08:12.889 #10 NEW cov: 10740 ft: 12771 corp: 3/227b lim: 320 exec/s: 0 rss: 68Mb L: 113/113 MS: 1 CrossOver- 00:08:12.889 [2024-07-15 00:16:11.913193] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:12.889 [2024-07-15 00:16:11.913219] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:12.889 [2024-07-15 00:16:11.913230] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:12.889 [2024-07-15 00:16:11.913248] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.148 #11 NEW cov: 10740 ft: 14505 corp: 4/340b lim: 320 exec/s: 0 rss: 69Mb L: 113/113 MS: 1 ChangeByte- 00:08:13.148 [2024-07-15 00:16:12.028050] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:13.148 [2024-07-15 00:16:12.028074] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:13.148 [2024-07-15 00:16:12.028084] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:13.148 [2024-07-15 00:16:12.028116] vfio_user.c: 
144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.148 #16 NEW cov: 10740 ft: 14824 corp: 5/470b lim: 320 exec/s: 0 rss: 69Mb L: 130/130 MS: 5 CrossOver-CMP-InsertByte-ChangeBit-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:08:13.148 [2024-07-15 00:16:12.153102] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:13.148 [2024-07-15 00:16:12.153127] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:13.148 [2024-07-15 00:16:12.153139] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:13.148 [2024-07-15 00:16:12.153158] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.406 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.406 #17 NEW cov: 10760 ft: 15207 corp: 6/576b lim: 320 exec/s: 0 rss: 69Mb L: 106/130 MS: 1 EraseBytes- 00:08:13.406 #18 NEW cov: 10764 ft: 15420 corp: 7/682b lim: 320 exec/s: 18 rss: 69Mb L: 106/130 MS: 1 ChangeBit- 00:08:13.406 [2024-07-15 00:16:12.380921] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:13.406 [2024-07-15 00:16:12.380945] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:13.406 [2024-07-15 00:16:12.380955] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:13.406 [2024-07-15 00:16:12.380988] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.406 #19 NEW cov: 10764 ft: 15624 corp: 8/799b lim: 320 exec/s: 19 rss: 69Mb L: 117/130 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:13.665 [2024-07-15 00:16:12.495818] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:13.665 [2024-07-15 00:16:12.495843] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:13.665 [2024-07-15 00:16:12.495853] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:13.665 [2024-07-15 00:16:12.495887] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.665 #20 NEW cov: 10764 ft: 15692 corp: 9/991b lim: 320 exec/s: 20 rss: 69Mb L: 192/192 MS: 1 InsertRepeatedBytes- 00:08:13.665 [2024-07-15 00:16:12.610577] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:13.666 [2024-07-15 00:16:12.610617] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:13.666 [2024-07-15 00:16:12.610628] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:13.666 [2024-07-15 00:16:12.610645] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.666 #21 NEW cov: 10764 ft: 16144 corp: 10/1259b lim: 320 exec/s: 21 rss: 69Mb L: 268/268 MS: 1 CrossOver- 00:08:13.924 #24 NEW cov: 10764 ft: 16261 corp: 
11/1364b lim: 320 exec/s: 24 rss: 69Mb L: 105/268 MS: 3 InsertByte-ShuffleBytes-CrossOver- 00:08:13.924 [2024-07-15 00:16:12.852303] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:13.924 [2024-07-15 00:16:12.852332] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:13.924 [2024-07-15 00:16:12.852343] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:13.924 [2024-07-15 00:16:12.852361] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:13.924 #25 NEW cov: 10764 ft: 16351 corp: 12/1620b lim: 320 exec/s: 25 rss: 69Mb L: 256/268 MS: 1 InsertRepeatedBytes- 00:08:14.183 #26 NEW cov: 10764 ft: 16547 corp: 13/1726b lim: 320 exec/s: 26 rss: 69Mb L: 106/268 MS: 1 ChangeBinInt- 00:08:14.183 [2024-07-15 00:16:13.085181] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:14.183 [2024-07-15 00:16:13.085207] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:14.183 [2024-07-15 00:16:13.085217] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:14.183 [2024-07-15 00:16:13.085250] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:14.183 #27 NEW cov: 10771 ft: 16557 corp: 14/1865b lim: 320 exec/s: 27 rss: 69Mb L: 139/268 MS: 1 InsertRepeatedBytes- 00:08:14.183 [2024-07-15 00:16:13.201067] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:14.183 [2024-07-15 00:16:13.201091] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:14.183 [2024-07-15 00:16:13.201101] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:14.183 [2024-07-15 00:16:13.201133] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:14.440 #33 NEW cov: 10771 ft: 16594 corp: 15/1979b lim: 320 exec/s: 33 rss: 69Mb L: 114/268 MS: 1 InsertByte- 00:08:14.440 [2024-07-15 00:16:13.316947] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:14.440 [2024-07-15 00:16:13.316970] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:14.440 [2024-07-15 00:16:13.316980] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:14.440 [2024-07-15 00:16:13.317012] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:14.440 #34 NEW cov: 10771 ft: 16705 corp: 16/2092b lim: 320 exec/s: 17 rss: 69Mb L: 113/268 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:14.440 #34 DONE cov: 10771 ft: 16705 corp: 16/2092b lim: 320 exec/s: 17 rss: 69Mb 00:08:14.440 ###### Recommended dictionary. ###### 00:08:14.440 "\000\000\000\000" # Uses: 2 00:08:14.440 ###### End of recommended dictionary. 
###### 00:08:14.440 Done 34 runs in 2 second(s) 00:08:14.697 00:16:13 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:14.697 00:16:13 -- ../common.sh@72 -- # (( i++ )) 00:08:14.697 00:16:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.697 00:16:13 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:14.697 00:16:13 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:14.697 00:16:13 -- vfio/run.sh@23 -- # local timen=1 00:08:14.697 00:16:13 -- vfio/run.sh@24 -- # local core=0x1 00:08:14.697 00:16:13 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:14.697 00:16:13 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:14.697 00:16:13 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:14.697 00:16:13 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:14.697 00:16:13 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:14.697 00:16:13 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:14.697 00:16:13 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:14.697 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:14.697 00:16:13 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:14.697 [2024-07-15 00:16:13.687831] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:14.697 [2024-07-15 00:16:13.687900] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid339473 ] 00:08:14.697 EAL: No free 2048 kB hugepages reported on node 1 00:08:14.955 [2024-07-15 00:16:13.761067] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.955 [2024-07-15 00:16:13.833342] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:14.955 [2024-07-15 00:16:13.833512] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.214 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.214 INFO: Seed: 2832885613 00:08:15.214 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:15.214 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:15.214 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:15.214 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.214 #2 INITED exec/s: 0 rss: 61Mb 00:08:15.214 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:15.214 This may also happen if the target rejected all inputs we tried so far 00:08:15.472 NEW_FUNC[1/632]: 0x4825d0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:15.472 NEW_FUNC[2/632]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:15.472 #9 NEW cov: 10701 ft: 10641 corp: 2/77b lim: 320 exec/s: 0 rss: 66Mb L: 76/76 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:15.731 #10 NEW cov: 10719 ft: 14669 corp: 3/153b lim: 320 exec/s: 0 rss: 67Mb L: 76/76 MS: 1 CopyPart- 00:08:15.990 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:15.990 #11 NEW cov: 10736 ft: 15675 corp: 4/229b lim: 320 exec/s: 0 rss: 68Mb L: 76/76 MS: 1 ChangeBit- 00:08:16.249 #12 NEW cov: 10736 ft: 15792 corp: 5/305b lim: 320 exec/s: 12 rss: 68Mb L: 76/76 MS: 1 ChangeByte- 00:08:16.249 #13 NEW cov: 10736 ft: 15826 corp: 6/381b lim: 320 exec/s: 13 rss: 68Mb L: 76/76 MS: 1 ChangeBinInt- 00:08:16.508 #14 NEW cov: 10736 ft: 16023 corp: 7/457b lim: 320 exec/s: 14 rss: 68Mb L: 76/76 MS: 1 ChangeBit- 00:08:16.767 #15 NEW cov: 10736 ft: 16438 corp: 8/534b lim: 320 exec/s: 15 rss: 68Mb L: 77/77 MS: 1 InsertByte- 00:08:16.767 #16 NEW cov: 10736 ft: 16698 corp: 9/619b lim: 320 exec/s: 16 rss: 68Mb L: 85/85 MS: 1 CopyPart- 00:08:17.025 #17 NEW cov: 10743 ft: 16719 corp: 10/695b lim: 320 exec/s: 17 rss: 69Mb L: 76/85 MS: 1 ChangeBinInt- 00:08:17.284 #18 NEW cov: 10743 ft: 16818 corp: 11/771b lim: 320 exec/s: 9 rss: 69Mb L: 76/85 MS: 1 ChangeByte- 00:08:17.284 #18 DONE cov: 10743 ft: 16818 corp: 11/771b lim: 320 exec/s: 9 rss: 69Mb 00:08:17.284 Done 18 runs in 2 second(s) 00:08:17.544 00:16:16 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:17.544 00:16:16 -- ../common.sh@72 -- # (( i++ )) 00:08:17.544 00:16:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.544 00:16:16 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:17.544 00:16:16 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:17.544 00:16:16 -- vfio/run.sh@23 -- # local timen=1 00:08:17.544 00:16:16 -- vfio/run.sh@24 -- # local core=0x1 00:08:17.544 00:16:16 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:17.544 00:16:16 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:17.544 00:16:16 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:17.544 00:16:16 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:17.544 00:16:16 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:17.544 00:16:16 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:17.544 00:16:16 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:17.544 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:17.544 00:16:16 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:17.544 [2024-07-15 00:16:16.467928] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:17.544 [2024-07-15 00:16:16.467999] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid340012 ] 00:08:17.544 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.544 [2024-07-15 00:16:16.539204] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.803 [2024-07-15 00:16:16.607246] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.803 [2024-07-15 00:16:16.607393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.803 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.803 INFO: Seed: 1301902409 00:08:17.803 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:17.803 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:17.803 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:17.803 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.803 #2 INITED exec/s: 0 rss: 61Mb 00:08:17.803 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:17.803 This may also happen if the target rejected all inputs we tried so far 00:08:18.152 [2024-07-15 00:16:16.900470] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.152 [2024-07-15 00:16:16.900515] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.411 NEW_FUNC[1/638]: 0x482fd0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:18.411 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:18.411 #9 NEW cov: 10733 ft: 10677 corp: 2/58b lim: 120 exec/s: 0 rss: 67Mb L: 57/57 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:18.411 [2024-07-15 00:16:17.364961] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.411 [2024-07-15 00:16:17.365007] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.670 #10 NEW cov: 10747 ft: 14107 corp: 3/115b lim: 120 exec/s: 0 rss: 68Mb L: 57/57 MS: 1 ShuffleBytes- 00:08:18.670 [2024-07-15 00:16:17.543371] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.670 [2024-07-15 00:16:17.543401] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.670 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.670 #11 NEW cov: 10764 ft: 15423 corp: 4/172b lim: 120 exec/s: 0 rss: 69Mb L: 57/57 MS: 1 ChangeBinInt- 00:08:18.670 [2024-07-15 00:16:17.723902] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.670 [2024-07-15 00:16:17.723932] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.930 #12 NEW cov: 10764 ft: 15859 corp: 5/232b lim: 120 exec/s: 
12 rss: 69Mb L: 60/60 MS: 1 InsertRepeatedBytes- 00:08:18.930 [2024-07-15 00:16:17.910764] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.930 [2024-07-15 00:16:17.910794] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:19.189 #13 NEW cov: 10764 ft: 15993 corp: 6/292b lim: 120 exec/s: 13 rss: 69Mb L: 60/60 MS: 1 ShuffleBytes- 00:08:19.189 [2024-07-15 00:16:18.088683] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:19.189 [2024-07-15 00:16:18.088713] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:19.189 #14 NEW cov: 10764 ft: 16549 corp: 7/357b lim: 120 exec/s: 14 rss: 69Mb L: 65/65 MS: 1 CopyPart- 00:08:19.449 [2024-07-15 00:16:18.269619] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:19.449 [2024-07-15 00:16:18.269648] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:19.449 #15 NEW cov: 10764 ft: 16690 corp: 8/419b lim: 120 exec/s: 15 rss: 69Mb L: 62/65 MS: 1 CMP- DE: "\377\377"- 00:08:19.449 [2024-07-15 00:16:18.450214] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:19.449 [2024-07-15 00:16:18.450244] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:19.708 #16 NEW cov: 10764 ft: 16911 corp: 9/482b lim: 120 exec/s: 16 rss: 69Mb L: 63/65 MS: 1 InsertByte- 00:08:19.708 [2024-07-15 00:16:18.629918] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:19.708 [2024-07-15 00:16:18.629947] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:19.708 #17 NEW cov: 10771 ft: 17014 corp: 10/547b lim: 120 exec/s: 17 rss: 70Mb L: 65/65 MS: 1 ChangeBinInt- 00:08:19.967 [2024-07-15 00:16:18.808582] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:19.967 [2024-07-15 00:16:18.808611] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:19.967 #18 NEW cov: 10771 ft: 17260 corp: 11/598b lim: 120 exec/s: 9 rss: 70Mb L: 51/65 MS: 1 EraseBytes- 00:08:19.967 #18 DONE cov: 10771 ft: 17260 corp: 11/598b lim: 120 exec/s: 9 rss: 70Mb 00:08:19.967 ###### Recommended dictionary. ###### 00:08:19.967 "\377\377" # Uses: 0 00:08:19.967 ###### End of recommended dictionary. 
###### 00:08:19.967 Done 18 runs in 2 second(s) 00:08:20.226 00:16:19 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:20.226 00:16:19 -- ../common.sh@72 -- # (( i++ )) 00:08:20.226 00:16:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.226 00:16:19 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:20.226 00:16:19 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:20.226 00:16:19 -- vfio/run.sh@23 -- # local timen=1 00:08:20.226 00:16:19 -- vfio/run.sh@24 -- # local core=0x1 00:08:20.226 00:16:19 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:20.226 00:16:19 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:20.226 00:16:19 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:20.226 00:16:19 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:20.226 00:16:19 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:20.226 00:16:19 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:20.226 00:16:19 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:20.226 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:20.226 00:16:19 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:20.226 [2024-07-15 00:16:19.224891] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:20.226 [2024-07-15 00:16:19.224985] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid340554 ] 00:08:20.226 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.485 [2024-07-15 00:16:19.297439] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.485 [2024-07-15 00:16:19.364491] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.485 [2024-07-15 00:16:19.364634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.744 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.744 INFO: Seed: 4059901667 00:08:20.744 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:20.744 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:20.744 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:20.744 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.744 #2 INITED exec/s: 0 rss: 61Mb 00:08:20.744 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:20.744 This may also happen if the target rejected all inputs we tried so far 00:08:20.744 [2024-07-15 00:16:19.657483] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:20.744 [2024-07-15 00:16:19.657554] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:21.003 NEW_FUNC[1/638]: 0x483cc0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:21.003 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:21.003 #29 NEW cov: 10721 ft: 10623 corp: 2/40b lim: 90 exec/s: 0 rss: 67Mb L: 39/39 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:21.262 [2024-07-15 00:16:20.137604] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:21.262 [2024-07-15 00:16:20.137649] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:21.262 #30 NEW cov: 10735 ft: 14345 corp: 3/112b lim: 90 exec/s: 0 rss: 68Mb L: 72/72 MS: 1 InsertRepeatedBytes- 00:08:21.262 [2024-07-15 00:16:20.316054] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:21.262 [2024-07-15 00:16:20.316085] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:21.521 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.521 #31 NEW cov: 10752 ft: 15436 corp: 4/152b lim: 90 exec/s: 0 rss: 69Mb L: 40/72 MS: 1 InsertByte- 00:08:21.521 [2024-07-15 00:16:20.503495] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:21.521 [2024-07-15 00:16:20.503526] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:21.780 #37 NEW cov: 10752 ft: 15759 corp: 5/192b lim: 90 exec/s: 37 rss: 69Mb L: 40/72 MS: 1 InsertByte- 00:08:21.780 [2024-07-15 00:16:20.682046] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:21.780 [2024-07-15 00:16:20.682074] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:21.780 #38 NEW cov: 10752 ft: 16186 corp: 6/265b lim: 90 exec/s: 38 rss: 69Mb L: 73/73 MS: 1 InsertByte- 00:08:22.038 [2024-07-15 00:16:20.858939] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:22.039 [2024-07-15 00:16:20.858968] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:22.039 #39 NEW cov: 10752 ft: 16306 corp: 7/316b lim: 90 exec/s: 39 rss: 69Mb L: 51/73 MS: 1 CopyPart- 00:08:22.039 [2024-07-15 00:16:21.038343] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:22.039 [2024-07-15 00:16:21.038373] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:22.296 #40 NEW cov: 10752 ft: 16725 corp: 8/396b lim: 90 exec/s: 40 rss: 69Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:08:22.296 [2024-07-15 00:16:21.215975] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:22.296 [2024-07-15 00:16:21.216005] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:22.296 #41 NEW cov: 10752 ft: 16748 corp: 9/436b lim: 90 exec/s: 41 rss: 69Mb L: 40/80 MS: 1 ChangeBinInt- 00:08:22.555 [2024-07-15 
00:16:21.394290] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:22.555 [2024-07-15 00:16:21.394320] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:22.555 #47 NEW cov: 10759 ft: 17149 corp: 10/487b lim: 90 exec/s: 47 rss: 70Mb L: 51/80 MS: 1 CopyPart- 00:08:22.555 [2024-07-15 00:16:21.572874] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:22.555 [2024-07-15 00:16:21.572903] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:22.814 #51 NEW cov: 10759 ft: 17531 corp: 11/527b lim: 90 exec/s: 25 rss: 70Mb L: 40/80 MS: 4 CopyPart-ChangeByte-ChangeBit-CrossOver- 00:08:22.814 #51 DONE cov: 10759 ft: 17531 corp: 11/527b lim: 90 exec/s: 25 rss: 70Mb 00:08:22.814 Done 51 runs in 2 second(s) 00:08:23.073 00:16:21 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:23.073 00:16:21 -- ../common.sh@72 -- # (( i++ )) 00:08:23.073 00:16:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.073 00:16:21 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:23.073 00:08:23.073 real 0m19.454s 00:08:23.073 user 0m26.862s 00:08:23.073 sys 0m1.811s 00:08:23.073 00:16:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.073 00:16:21 -- common/autotest_common.sh@10 -- # set +x 00:08:23.073 ************************************ 00:08:23.073 END TEST vfio_fuzz 00:08:23.073 ************************************ 00:08:23.073 00:16:21 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:08:23.073 00:08:23.073 real 1m23.609s 00:08:23.073 user 2m6.915s 00:08:23.073 sys 0m9.259s 00:08:23.073 00:16:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.073 00:16:21 -- common/autotest_common.sh@10 -- # set +x 00:08:23.073 ************************************ 00:08:23.073 END TEST llvm_fuzz 00:08:23.073 ************************************ 00:08:23.073 00:16:22 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:08:23.073 00:16:22 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:08:23.073 00:16:22 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:08:23.073 00:16:22 -- common/autotest_common.sh@712 -- # xtrace_disable 00:08:23.073 00:16:22 -- common/autotest_common.sh@10 -- # set +x 00:08:23.073 00:16:22 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:08:23.073 00:16:22 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:08:23.073 00:16:22 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:08:23.073 00:16:22 -- common/autotest_common.sh@10 -- # set +x 00:08:29.638 INFO: APP EXITING 00:08:29.638 INFO: killing all VMs 00:08:29.638 INFO: killing vhost app 00:08:29.638 INFO: EXIT DONE 00:08:32.175 Waiting for block devices as requested 00:08:32.175 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:32.175 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:32.434 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:32.434 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:32.434 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:32.694 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:32.694 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:32.694 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:32.694 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:32.953 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:32.953 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:32.953 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:33.213 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:33.213 0000:80:04.2 
(8086 2021): vfio-pci -> ioatdma 00:08:33.213 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:33.472 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:33.472 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:37.665 Cleaning 00:08:37.665 Removing: /dev/shm/spdk_tgt_trace.pid303254 00:08:37.665 Removing: /var/run/dpdk/spdk_pid300790 00:08:37.665 Removing: /var/run/dpdk/spdk_pid302045 00:08:37.665 Removing: /var/run/dpdk/spdk_pid303254 00:08:37.665 Removing: /var/run/dpdk/spdk_pid303851 00:08:37.665 Removing: /var/run/dpdk/spdk_pid304121 00:08:37.665 Removing: /var/run/dpdk/spdk_pid304436 00:08:37.665 Removing: /var/run/dpdk/spdk_pid304774 00:08:37.665 Removing: /var/run/dpdk/spdk_pid305077 00:08:37.665 Removing: /var/run/dpdk/spdk_pid305363 00:08:37.665 Removing: /var/run/dpdk/spdk_pid305653 00:08:37.665 Removing: /var/run/dpdk/spdk_pid305966 00:08:37.665 Removing: /var/run/dpdk/spdk_pid306831 00:08:37.665 Removing: /var/run/dpdk/spdk_pid309920 00:08:37.665 Removing: /var/run/dpdk/spdk_pid310324 00:08:37.666 Removing: /var/run/dpdk/spdk_pid310637 00:08:37.666 Removing: /var/run/dpdk/spdk_pid310654 00:08:37.666 Removing: /var/run/dpdk/spdk_pid311231 00:08:37.666 Removing: /var/run/dpdk/spdk_pid311497 00:08:37.666 Removing: /var/run/dpdk/spdk_pid311958 00:08:37.666 Removing: /var/run/dpdk/spdk_pid312075 00:08:37.666 Removing: /var/run/dpdk/spdk_pid312385 00:08:37.666 Removing: /var/run/dpdk/spdk_pid312653 00:08:37.666 Removing: /var/run/dpdk/spdk_pid312774 00:08:37.666 Removing: /var/run/dpdk/spdk_pid312954 00:08:37.666 Removing: /var/run/dpdk/spdk_pid313391 00:08:37.666 Removing: /var/run/dpdk/spdk_pid313622 00:08:37.666 Removing: /var/run/dpdk/spdk_pid313904 00:08:37.666 Removing: /var/run/dpdk/spdk_pid314168 00:08:37.666 Removing: /var/run/dpdk/spdk_pid314363 00:08:37.666 Removing: /var/run/dpdk/spdk_pid314555 00:08:37.666 Removing: /var/run/dpdk/spdk_pid314608 00:08:37.666 Removing: /var/run/dpdk/spdk_pid314874 00:08:37.666 Removing: /var/run/dpdk/spdk_pid315167 00:08:37.666 Removing: /var/run/dpdk/spdk_pid315429 00:08:37.666 Removing: /var/run/dpdk/spdk_pid315624 00:08:37.666 Removing: /var/run/dpdk/spdk_pid315793 00:08:37.666 Removing: /var/run/dpdk/spdk_pid316030 00:08:37.666 Removing: /var/run/dpdk/spdk_pid316296 00:08:37.666 Removing: /var/run/dpdk/spdk_pid316587 00:08:37.666 Removing: /var/run/dpdk/spdk_pid316856 00:08:37.666 Removing: /var/run/dpdk/spdk_pid317139 00:08:37.666 Removing: /var/run/dpdk/spdk_pid317368 00:08:37.666 Removing: /var/run/dpdk/spdk_pid317569 00:08:37.666 Removing: /var/run/dpdk/spdk_pid317729 00:08:37.666 Removing: /var/run/dpdk/spdk_pid318005 00:08:37.666 Removing: /var/run/dpdk/spdk_pid318277 00:08:37.666 Removing: /var/run/dpdk/spdk_pid318558 00:08:37.666 Removing: /var/run/dpdk/spdk_pid318835 00:08:37.666 Removing: /var/run/dpdk/spdk_pid319117 00:08:37.666 Removing: /var/run/dpdk/spdk_pid319338 00:08:37.666 Removing: /var/run/dpdk/spdk_pid319545 00:08:37.666 Removing: /var/run/dpdk/spdk_pid319715 00:08:37.666 Removing: /var/run/dpdk/spdk_pid319979 00:08:37.666 Removing: /var/run/dpdk/spdk_pid320245 00:08:37.666 Removing: /var/run/dpdk/spdk_pid320534 00:08:37.666 Removing: /var/run/dpdk/spdk_pid320802 00:08:37.666 Removing: /var/run/dpdk/spdk_pid321088 00:08:37.666 Removing: /var/run/dpdk/spdk_pid321320 00:08:37.666 Removing: /var/run/dpdk/spdk_pid321523 00:08:37.666 Removing: /var/run/dpdk/spdk_pid321681 00:08:37.666 Removing: /var/run/dpdk/spdk_pid321954 00:08:37.666 Removing: /var/run/dpdk/spdk_pid322226 00:08:37.666 Removing: 
/var/run/dpdk/spdk_pid322507 00:08:37.666 Removing: /var/run/dpdk/spdk_pid322785 00:08:37.666 Removing: /var/run/dpdk/spdk_pid323069 00:08:37.666 Removing: /var/run/dpdk/spdk_pid323304 00:08:37.666 Removing: /var/run/dpdk/spdk_pid323524 00:08:37.666 Removing: /var/run/dpdk/spdk_pid323687 00:08:37.666 Removing: /var/run/dpdk/spdk_pid323940 00:08:37.666 Removing: /var/run/dpdk/spdk_pid324214 00:08:37.666 Removing: /var/run/dpdk/spdk_pid324496 00:08:37.666 Removing: /var/run/dpdk/spdk_pid324659 00:08:37.666 Removing: /var/run/dpdk/spdk_pid324899 00:08:37.666 Removing: /var/run/dpdk/spdk_pid325574 00:08:37.666 Removing: /var/run/dpdk/spdk_pid326012 00:08:37.666 Removing: /var/run/dpdk/spdk_pid326685 00:08:37.666 Removing: /var/run/dpdk/spdk_pid327544 00:08:37.666 Removing: /var/run/dpdk/spdk_pid327856 00:08:37.666 Removing: /var/run/dpdk/spdk_pid328378 00:08:37.666 Removing: /var/run/dpdk/spdk_pid328870 00:08:37.666 Removing: /var/run/dpdk/spdk_pid329212 00:08:37.666 Removing: /var/run/dpdk/spdk_pid329759 00:08:37.666 Removing: /var/run/dpdk/spdk_pid330199 00:08:37.666 Removing: /var/run/dpdk/spdk_pid330593 00:08:37.666 Removing: /var/run/dpdk/spdk_pid331130 00:08:37.666 Removing: /var/run/dpdk/spdk_pid331559 00:08:37.666 Removing: /var/run/dpdk/spdk_pid331966 00:08:37.666 Removing: /var/run/dpdk/spdk_pid332511 00:08:37.666 Removing: /var/run/dpdk/spdk_pid332929 00:08:37.666 Removing: /var/run/dpdk/spdk_pid333345 00:08:37.666 Removing: /var/run/dpdk/spdk_pid333882 00:08:37.666 Removing: /var/run/dpdk/spdk_pid334253 00:08:37.666 Removing: /var/run/dpdk/spdk_pid334714 00:08:37.666 Removing: /var/run/dpdk/spdk_pid335257 00:08:37.666 Removing: /var/run/dpdk/spdk_pid335675 00:08:37.666 Removing: /var/run/dpdk/spdk_pid336096 00:08:37.666 Removing: /var/run/dpdk/spdk_pid336636 00:08:37.666 Removing: /var/run/dpdk/spdk_pid337036 00:08:37.666 Removing: /var/run/dpdk/spdk_pid337543 00:08:37.666 Removing: /var/run/dpdk/spdk_pid338082 00:08:37.666 Removing: /var/run/dpdk/spdk_pid338624 00:08:37.666 Removing: /var/run/dpdk/spdk_pid339109 00:08:37.666 Removing: /var/run/dpdk/spdk_pid339473 00:08:37.666 Removing: /var/run/dpdk/spdk_pid340012 00:08:37.666 Removing: /var/run/dpdk/spdk_pid340554 00:08:37.666 Clean 00:08:37.666 killing process with pid 256059 00:08:41.859 killing process with pid 256056 00:08:41.859 killing process with pid 256058 00:08:41.859 killing process with pid 256057 00:08:41.859 00:16:40 -- common/autotest_common.sh@1436 -- # return 0 00:08:41.859 00:16:40 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:08:41.859 00:16:40 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:41.859 00:16:40 -- common/autotest_common.sh@10 -- # set +x 00:08:41.859 00:16:40 -- spdk/autotest.sh@389 -- # timing_exit autotest 00:08:41.859 00:16:40 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:41.859 00:16:40 -- common/autotest_common.sh@10 -- # set +x 00:08:41.859 00:16:40 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:41.859 00:16:40 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:41.859 00:16:40 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:41.859 00:16:40 -- spdk/autotest.sh@394 -- # hash lcov 00:08:41.859 00:16:40 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:08:41.859 00:16:40 -- common/autobuild_common.sh@15 -- $ source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:41.859 00:16:40 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:08:41.859 00:16:40 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:41.859 00:16:40 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:41.859 00:16:40 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.859 00:16:40 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.859 00:16:40 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.859 00:16:40 -- paths/export.sh@5 -- $ export PATH 00:08:41.859 00:16:40 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.859 00:16:40 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:08:41.859 00:16:40 -- common/autobuild_common.sh@435 -- $ date +%s 00:08:41.859 00:16:40 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720995400.XXXXXX 00:08:41.859 00:16:40 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720995400.O1abnW 00:08:41.859 00:16:40 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:08:41.859 00:16:40 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']' 00:08:41.859 00:16:40 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:08:41.859 00:16:40 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:08:41.860 00:16:40 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:08:41.860 00:16:40 -- common/autobuild_common.sh@451 -- $ get_config_params 00:08:41.860 00:16:40 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:08:41.860 00:16:40 -- common/autotest_common.sh@10 -- $ set +x 00:08:41.860 00:16:40 -- 
common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:08:41.860 00:16:40 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:08:41.860 00:16:40 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:41.860 00:16:40 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:08:41.860 00:16:40 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:08:41.860 00:16:40 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:08:41.860 00:16:40 -- spdk/autopackage.sh@19 -- $ timing_finish 00:08:41.860 00:16:40 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:08:41.860 00:16:40 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:08:41.860 00:16:40 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:41.860 00:16:40 -- spdk/autopackage.sh@20 -- $ exit 0 00:08:41.860 + [[ -n 212686 ]] 00:08:41.860 + sudo kill 212686 00:08:41.870 [Pipeline] } 00:08:41.890 [Pipeline] // stage 00:08:41.896 [Pipeline] } 00:08:41.914 [Pipeline] // timeout 00:08:41.920 [Pipeline] } 00:08:41.939 [Pipeline] // catchError 00:08:41.946 [Pipeline] } 00:08:41.963 [Pipeline] // wrap 00:08:41.968 [Pipeline] } 00:08:41.987 [Pipeline] // catchError 00:08:42.000 [Pipeline] stage 00:08:42.003 [Pipeline] { (Epilogue) 00:08:42.024 [Pipeline] catchError 00:08:42.025 [Pipeline] { 00:08:42.039 [Pipeline] echo 00:08:42.040 Cleanup processes 00:08:42.047 [Pipeline] sh 00:08:42.331 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:42.331 349630 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:42.347 [Pipeline] sh 00:08:42.631 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:42.631 ++ grep -v 'sudo pgrep' 00:08:42.631 ++ awk '{print $1}' 00:08:42.631 + sudo kill -9 00:08:42.631 + true 00:08:42.644 [Pipeline] sh 00:08:42.929 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:08:42.929 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:08:42.929 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:08:43.881 [Pipeline] sh 00:08:44.167 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:08:44.167 Artifacts sizes are good 00:08:44.187 [Pipeline] archiveArtifacts 00:08:44.196 Archiving artifacts 00:08:44.314 [Pipeline] sh 00:08:44.627 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:08:44.640 [Pipeline] cleanWs 00:08:44.649 [WS-CLEANUP] Deleting project workspace... 00:08:44.649 [WS-CLEANUP] Deferred wipeout is used... 00:08:44.654 [WS-CLEANUP] done 00:08:44.655 [Pipeline] } 00:08:44.675 [Pipeline] // catchError 00:08:44.687 [Pipeline] sh 00:08:44.967 + logger -p user.info -t JENKINS-CI 00:08:44.976 [Pipeline] } 00:08:44.992 [Pipeline] // stage 00:08:44.998 [Pipeline] } 00:08:45.015 [Pipeline] // node 00:08:45.035 [Pipeline] End of Pipeline 00:08:45.072 Finished: SUCCESS